ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
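The twelve "statically imported" notices above come from six import_tasks pairs in tests_luks.yml: static imports are resolved at parse time, so every occurrence is reported before the play starts. A minimal sketch of the pattern, with the file names taken from the log and the task placement assumed:

    # Static imports are resolved when the playbook is parsed, which is why
    # each pair is logged up front rather than when the tasks execute.
    - name: Write a test file onto the volume under test (assumed wrapper)
      import_tasks: create-test-file.yml

    - name: Verify the test file survived the storage change (assumed wrapper)
      import_tasks: verify-data-preservation.yml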
PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Wednesday 18 March 2026 19:38:36 -0400 (0:00:00.388) 0:00:00.388 *******
ok: [managed-node4]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Wednesday 18 March 2026 19:38:42 -0400 (0:00:06.330) 0:00:06.718 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Wednesday 18 March 2026 19:38:43 -0400 (0:00:00.486) 0:00:07.205 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Wednesday 18 March 2026 19:38:43 -0400 (0:00:00.634) 0:00:07.839 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Wednesday 18 March 2026 19:38:44 -0400 (0:00:00.533) 0:00:08.373 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Wednesday 18 March 2026 19:38:44 -0400 (0:00:00.418) 0:00:08.791 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Wednesday 18 March 2026 19:38:45 -0400 (0:00:00.476) 0:00:09.268 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Wednesday 18 March 2026 19:38:45 -0400 (0:00:00.621) 0:00:09.889 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Wednesday 18 March 2026 19:38:46 -0400 (0:00:00.548) 0:00:10.438 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24
Wednesday 18 March 2026 19:38:46 -0400 (0:00:00.429) 0:00:10.867 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34
Wednesday 18 March 2026 19:38:47 -0400 (0:00:00.545) 0:00:11.413 *******

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 18 March 2026 19:38:48 -0400 (0:00:00.982) 0:00:12.395 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 18 March 2026 19:38:48 -0400 (0:00:00.421) 0:00:12.817 *******
ok: [managed-node4]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 18 March 2026 19:38:52 -0400 (0:00:04.013) 0:00:16.830 *******
skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 18 March 2026 19:38:53 -0400 (0:00:00.526) 0:00:17.357 *******
ok: [managed-node4] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 18 March 2026 19:38:57 -0400 (0:00:04.100) 0:00:21.458 *******
ok: [managed-node4] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }
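The "Set platform/version specific variables" loop above is the role's distribution-specific vars lookup: it iterates from the most generic to the most specific filename and loads whichever files exist, which on this host is only CentOS_7.yml. A sketch of the pattern, assuming a simple file-existence guard (the role's actual condition may differ):

    - name: Set platform/version specific variables (sketch)
      include_vars: "{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"        # RedHat.yml
        - "{{ ansible_facts['distribution'] }}.yml"     # CentOS.yml
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"  # CentOS_7.yml
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"        # CentOS_7.9.yml
      when: (role_path ~ '/vars/' ~ item) is file       # assumed guard; skipped items log "Conditional result was False"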
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 18 March 2026 19:38:57 -0400 (0:00:00.412) 0:00:21.870 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 18 March 2026 19:38:57 -0400 (0:00:00.141) 0:00:22.012 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 18 March 2026 19:38:58 -0400 (0:00:00.384) 0:00:22.396 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 18 March 2026 19:38:58 -0400 (0:00:00.652) 0:00:23.049 *******
ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 18 March 2026 19:39:04 -0400 (0:00:05.529) 0:00:28.579 *******
ok: [managed-node4] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 18 March 2026 19:39:04 -0400 (0:00:00.224) 0:00:28.804 *******
ok: [managed-node4] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 18 March 2026 19:39:05 -0400 (0:00:00.296) 0:00:29.100 *******
ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
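The "Make sure blivet is available" task above installs blivet_package_list as loaded from vars/CentOS_7.yml; note the Jinja conditional in that list, which swaps in libblockdev-s390 only on s390x hosts. The install step is roughly the following (module choice assumed):

    - name: Make sure blivet is available (sketch)
      package:
        name: "{{ blivet_package_list }}"   # python-blivet3 plus the libblockdev plugins, per vars/CentOS_7.yml
        state: present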
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Wednesday 18 March 2026 19:39:09 -0400 (0:00:04.302) 0:00:33.403 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 18 March 2026 19:39:09 -0400 (0:00:00.436) 0:00:33.839 *******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 18 March 2026 19:39:10 -0400 (0:00:00.261) 0:00:34.101 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 18 March 2026 19:39:10 -0400 (0:00:00.278) 0:00:34.380 *******

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Wednesday 18 March 2026 19:39:10 -0400 (0:00:00.077) 0:00:34.457 *******
ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Wednesday 18 March 2026 19:39:13 -0400 (0:00:03.152) 0:00:37.610 *******
ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" },
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", 
"status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { 
"name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:39:18 -0400 (0:00:04.628) 0:00:42.238 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:39:18 -0400 (0:00:00.472) 0:00:42.711 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:39:20 -0400 (0:00:02.352) 0:00:45.064 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:39:21 -0400 (0:00:00.264) 0:00:45.329 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773876156.2443287, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1773876154.288323, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773876154.288323, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:39:23 -0400 (0:00:02.207) 0:00:47.537 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:39:23 -0400 (0:00:00.418) 0:00:47.955 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:39:24 -0400 (0:00:00.389) 0:00:48.344 ******* ok: [managed-node4] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:39:24 -0400 (0:00:00.344) 0:00:48.689 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:39:24 -0400 (0:00:00.347) 0:00:49.037 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:39:25 -0400 (0:00:00.320) 0:00:49.357 ******* TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:39:25 -0400 (0:00:00.429) 0:00:49.786 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:39:25 -0400 (0:00:00.262) 0:00:50.049 ******* TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:39:26 -0400 (0:00:00.316) 0:00:50.366 ******* TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:39:26 -0400 (0:00:00.213) 0:00:50.579 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:39:26 -0400 (0:00:00.316) 0:00:50.895 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773875772.7221103, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, 
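Both "Tell systemd to refresh its view of /etc/fstab" tasks were skipped because this run changed no mounts. When mounts do change, the refresh amounts to a systemd daemon reload; a sketch, with a hypothetical guard variable standing in for the role's real condition:

    - name: Tell systemd to refresh its view of /etc/fstab (sketch)
      systemd:
        daemon_reload: true
      when: __storage_mounts_changed | d(false)   # hypothetical guard; the role's actual conditional differs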
"executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:39:28 -0400 (0:00:01.457) 0:00:52.353 ******* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:39:28 -0400 (0:00:00.193) 0:00:52.546 ******* ok: [managed-node4] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:75 Wednesday 18 March 2026 19:39:30 -0400 (0:00:02.504) 0:00:55.050 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node4 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Wednesday 18 March 2026 19:39:31 -0400 (0:00:00.609) 0:00:55.659 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Wednesday 18 March 2026 19:39:35 -0400 (0:00:03.499) 0:00:59.159 ******* ok: [managed-node4] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 
TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Wednesday 18 March 2026 19:39:35 -0400 (0:00:03.499) 0:00:59.159 *******
ok: [managed-node4] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] has partitions" ] }

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Wednesday 18 March 2026 19:39:38 -0400 (0:00:03.811) 0:01:02.970 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Wednesday 18 March 2026 19:39:39 -0400 (0:00:00.288) 0:01:03.259 *******
ok: [managed-node4] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Wednesday 18 March 2026 19:39:39 -0400 (0:00:00.355) 0:01:03.614 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Wednesday 18 March 2026 19:39:39 -0400 (0:00:00.362) 0:01:03.976 *******
ok: [managed-node4] => { "unused_disks": [ "sda" ] }

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:84
Wednesday 18 March 2026 19:39:40 -0400 (0:00:00.287) 0:01:04.264 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Wednesday 18 March 2026 19:39:40 -0400 (0:00:00.429) 0:01:04.694 *******
ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
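verify-role-failed.yml first snapshots the current storage_* inputs (note storage_safe_mode_global: true, so destructive operations stay disabled), then runs the role expecting a failure: formatting a new LUKS volume with no key supplied must abort with an error rather than proceed. A block/rescue sketch of that general expectation (variable wiring assumed; the assertion is a placeholder, since the log is cut off before the actual error text appears):

    - name: Verify role raises the expected error (sketch)
      block:
        - name: Run the role with the failing input
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools: "{{ storage_pools_global }}"
            storage_volumes: "{{ storage_volumes_global }}"
            storage_safe_mode: "{{ storage_safe_mode_global }}"
        - name: Unreachable if the role failed as required
          fail:
            msg: role ran to completion but an error was expected
      rescue:
        - name: Placeholder check that some failure was raised
          assert:
            that: ansible_failed_task is defined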
TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Wednesday 18 March 2026 19:39:40 -0400 (0:00:00.334) 0:01:05.028 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24
Wednesday 18 March 2026 19:39:41 -0400 (0:00:00.503) 0:01:05.531 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34
Wednesday 18 March 2026 19:39:41 -0400 (0:00:00.265) 0:01:05.797 *******

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 18 March 2026 19:39:42 -0400 (0:00:00.328) 0:01:06.125 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 18 March 2026 19:39:42 -0400 (0:00:00.329) 0:01:06.455 *******
ok: [managed-node4]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 18 March 2026 19:39:44 -0400 (0:00:02.592) 0:01:09.047 *******
skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 18 March 2026 19:39:45 -0400 (0:00:00.774) 0:01:09.821 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 18 March 2026 19:39:46 -0400 (0:00:00.420) 0:01:10.242 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 18 March 2026 19:39:46 -0400 (0:00:00.312) 0:01:10.554 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 18 March 2026 19:39:46 -0400 (0:00:00.320) 0:01:10.874 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 18 March 2026 19:39:47 -0400 (0:00:00.268) 0:01:11.143 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 18 March 2026 19:39:47 -0400 (0:00:00.837) 0:01:11.981 *******
ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 18 March 2026 19:39:51 -0400 (0:00:03.575) 0:01:15.556 *******
ok: [managed-node4] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 18 March 2026 19:39:51 -0400 (0:00:00.274) 0:01:15.831 *******
ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 18 March 2026 19:39:51 -0400 (0:00:00.219) 0:01:16.050 *******
ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
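The storage_volumes value just shown is the failing-path input: a whole-disk volume on sda, mounted at /opt/test1, with encryption: true but no key or passphrase. As a standalone invocation it would look roughly like this (values copied from the log; the missing encryption_password is exactly what should make the role fail under safe mode):

    - hosts: managed-node4
      tasks:
        - name: Request an encrypted volume without supplying a key (expected to fail)
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: true
            storage_volumes:
              - name: foo
                type: disk
                disks: ["sda"]          # the unused disk found earlier
                mount_point: /opt/test1
                encryption: true        # no encryption_password given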
March 2026 19:39:58 -0400 (0:00:00.246) 0:01:22.224 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:39:58 -0400 (0:00:00.222) 0:01:22.447 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:39:58 -0400 (0:00:00.305) 0:01:22.752 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:40:02 -0400 (0:00:03.411) 0:01:26.163 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": 
"enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": 
"network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:40:05 -0400 (0:00:03.473) 0:01:29.637 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:40:05 -0400 (0:00:00.416) 0:01:30.054 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 19:40:11 -0400 (0:00:05.884) 0:01:35.938 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, 
u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:40:12 -0400 (0:00:00.254) 0:01:36.193 ******* TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 19:40:12 -0400 (0:00:00.379) 0:01:36.572 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 19:40:12 -0400 (0:00:00.436) 0:01:37.009 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 19:40:13 -0400 (0:00:00.384) 0:01:37.393 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Wednesday 18 March 2026 19:40:13 -0400 (0:00:00.239) 0:01:37.632 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:40:14 -0400 (0:00:00.623) 0:01:38.256 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:40:14 -0400 (0:00:00.220) 0:01:38.477 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:40:14 -0400 (0:00:00.242) 0:01:38.719 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:40:14 -0400 (0:00:00.231) 0:01:38.951 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:40:17 -0400 (0:00:02.373) 0:01:41.325 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:40:17 -0400 (0:00:00.676) 0:01:42.001 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:40:18 -0400 (0:00:00.363) 0:01:42.365 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:40:18 -0400 (0:00:00.180) 0:01:42.546 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:40:18 -0400 (0:00:00.269) 0:01:42.815 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:40:19 -0400 (0:00:00.284) 0:01:43.100 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make 
sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:40:19 -0400 (0:00:00.520) 0:01:43.620 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:40:22 -0400 (0:00:03.126) 0:01:46.747 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:40:22 -0400 (0:00:00.313) 0:01:47.061 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:40:23 -0400 (0:00:00.363) 0:01:47.424 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:40:28 -0400 (0:00:05.508) 0:01:52.933 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:40:29 -0400 (0:00:00.282) 0:01:53.215 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:40:29 -0400 (0:00:00.251) 0:01:53.467 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 
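Note that this second invocation carries the credential whose absence caused the earlier fatal result, "encrypted volume 'foo' missing key/password". Reconstructed from the storage_volumes values echoed just above (the test file itself is not shown in this excerpt), the role call is roughly:

    - name: Create an encrypted disk volume w/ default fs
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda                       # whole-disk volume on sda
            mount_point: /opt/test1
            encryption: true              # LUKS on top of the disk
            encryption_password: yabbadabbadoo

With encryption: true but neither encryption_password nor encryption_key set (both were None in the failed run's module arguments), the role's blivet module refuses to proceed while safe_mode is true and aborts before taking any actions, which matches the empty "actions" list in the fatal output above.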
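The "Get required packages" step seen in both runs computes package needs without touching storage. Judging from the module arguments echoed in the failure dump (which include a packages_only flag), it is a call along these lines — a sketch, with defaults elided and the register name illustrative:

    - name: Get required packages
      blivet:
        packages_only: true               # report packages, change nothing
        pools: "{{ storage_pools | d([]) }}"
        volumes: "{{ storage_volumes | d([]) }}"
      register: package_info

With packages_only: true the module returns only the "packages" list (just cryptsetup in this run, since the volume requests LUKS), letting the role install prerequisites before the real apply pass later in main-blivet.yml.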
Wednesday 18 March 2026 19:40:29 -0400 (0:00:00.244) 0:01:53.712 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:40:29 -0400 (0:00:00.269) 0:01:53.982 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:40:32 -0400 (0:00:03.034) 0:01:57.016 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": 
"nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": 
"systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] 
*** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:40:36 -0400 (0:00:03.148) 0:02:00.164 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:40:36 -0400 (0:00:00.291) 0:02:00.456 ******* changed: [managed-node4] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:40:48 -0400 (0:00:11.991) 0:02:12.448 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:40:48 -0400 (0:00:00.454) 0:02:12.902 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773876156.2443287, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 
1773876154.288323, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773876154.288323, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:40:50 -0400 (0:00:02.002) 0:02:14.904 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:40:53 -0400 (0:00:02.919) 0:02:17.824 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:40:54 -0400 (0:00:00.446) 0:02:18.270 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": 
null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:40:54 -0400 (0:00:00.450) 0:02:18.721 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:40:54 -0400 (0:00:00.297) 0:02:19.019 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:40:55 -0400 (0:00:00.359) 0:02:19.379 ******* TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:40:55 -0400 (0:00:00.293) 0:02:19.672 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:41:01 -0400 (0:00:06.294) 0:02:25.967 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", 
"state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:41:06 -0400 (0:00:04.572) 0:02:30.540 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:41:06 -0400 (0:00:00.352) 0:02:30.892 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:41:08 -0400 (0:00:02.060) 0:02:32.952 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773875772.7221103, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:41:10 -0400 (0:00:01.759) 0:02:34.712 ******* changed: [managed-node4] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-5dfb6f1c-458e-4933-922c-32b2268852fe', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:41:12 -0400 (0:00:01.903) 0:02:36.615 ******* ok: [managed-node4] TASK [Verify role results] 
***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:110 Wednesday 18 March 2026 19:41:14 -0400 (0:00:02.260) 0:02:38.876 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 19:41:15 -0400 (0:00:00.725) 0:02:39.601 ******* skipping: [managed-node4] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 19:41:16 -0400 (0:00:00.884) 0:02:40.486 ******* ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
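Note: the "Collect info about the volumes." task uses the test suite's own info-gathering module; its result (the "info" dict below) carries the same per-device fields that lsblk reports (name, fstype, label, mountpoint, size, type, uuid). A rough standalone approximation, assuming plain lsblk output is acceptable and ignoring the exact result shape:

    - name: Approximate the block device scan with lsblk (sketch only)
      command: lsblk -P -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID
      register: lsblk_scan     # hypothetical register name
      changed_when: false      # read-only query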
***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:41:16 -0400 (0:00:00.531) 0:02:41.017 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "size": "10G", "type": "crypt", "uuid": "e474acfe-c972-4ae0-a795-6f824663f19e" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "5dfb6f1c-458e-4933-922c-32b2268852fe" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:41:21 -0400 (0:00:04.379) 0:02:45.396 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003657", "end": "2026-03-18 19:41:24.041810", "rc": 0, "start": "2026-03-18 19:41:24.038153" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 19:41:24 -0400 (0:00:03.116) 0:02:48.512 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002674", "end": "2026-03-18 19:41:25.554338", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:41:25.551664" } STDOUT: luks-5dfb6f1c-458e-4933-922c-32b2268852fe /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:41:25 -0400 (0:00:01.475) 0:02:49.988 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 19:41:26 -0400 (0:00:00.309) 0:02:50.298 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:41:26 -0400 (0:00:00.505) 0:02:50.803 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:41:27 -0400 (0:00:00.495) 0:02:51.298 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 
March 2026 19:41:29 -0400 (0:00:01.977) 0:02:53.275 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 19:41:29 -0400 (0:00:00.325) 0:02:53.601 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 19:41:29 -0400 (0:00:00.300) 0:02:53.901 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 19:41:30 -0400 (0:00:00.555) 0:02:54.457 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 19:41:30 -0400 (0:00:00.410) 0:02:54.868 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 19:41:31 -0400 (0:00:00.368) 0:02:55.236 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 19:41:31 -0400 (0:00:00.327) 0:02:55.564 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 19:41:31 -0400 (0:00:00.418) 0:02:55.983 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 19:41:32 -0400 (0:00:00.415) 0:02:56.398 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 19:41:32 -0400 (0:00:00.432) 0:02:56.831 ******* skipping: 
[managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 19:41:33 -0400 (0:00:00.401) 0:02:57.232 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 19:41:33 -0400 (0:00:00.355) 0:02:57.588 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 19:41:34 -0400 (0:00:00.762) 0:02:58.351 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 19:41:34 -0400 (0:00:00.439) 0:02:58.790 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 19:41:35 -0400 (0:00:00.415) 0:02:59.206 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 19:41:35 -0400 (0:00:00.276) 0:02:59.483 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 19:41:35 -0400 (0:00:00.431) 0:02:59.914 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 19:41:36 -0400 (0:00:00.406) 0:03:00.321 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 19:41:36 -0400 (0:00:00.435) 0:03:00.757 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 19:41:37 -0400 (0:00:00.538) 0:03:01.296 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877247.8344932, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877247.8344932, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28971, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773877247.8344932, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 19:41:39 -0400 (0:00:01.890) 0:03:03.186 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 19:41:39 -0400 (0:00:00.474) 0:03:03.660 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 19:41:39 -0400 (0:00:00.358) 0:03:04.018 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 19:41:40 -0400 (0:00:00.456) 0:03:04.475 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 19:41:40 -0400 (0:00:00.387) 0:03:04.863 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] 
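Note: the encryption checks coming up stat the /dev/mapper node and then read the LUKS header with cryptsetup luksDump /dev/sda (visible in the output below). When reading that dump: it is a LUKS1 header, and "Cipher name: aes" with "Cipher mode: xts-plain64" and "MK bits: 512" amounts to AES-256 in XTS mode, since XTS consumes two keys and therefore halves the stated master-key size. The same query can be reproduced by hand; a sketch:

    - name: Read the LUKS header (same command the verification runs)
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false    # luksDump is read-only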
***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 19:41:41 -0400 (0:00:00.264) 0:03:05.127 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 19:41:41 -0400 (0:00:00.440) 0:03:05.568 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877247.9424937, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877247.9424937, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 105737, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773877247.9424937, "nlink": 1, "path": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 19:41:43 -0400 (0:00:02.013) 0:03:07.582 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 19:41:46 -0400 (0:00:03.098) 0:03:10.680 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.699549", "end": "2026-03-18 19:41:48.709033", "rc": 0, "start": "2026-03-18 19:41:48.009484" } STDOUT: LUKS header information for /dev/sda Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 9f c2 98 2b bb a2 51 a3 c3 d3 1a 12 36 4c 51 63 3c 40 c0 22 MK salt: fc cf 84 91 5d 03 da 05 ff 88 80 6c d1 22 34 b9 fc f8 4f 5a e6 cb 2c c4 ee 43 fc 80 33 f4 d8 3f MK iterations: 23239 UUID: 5dfb6f1c-458e-4933-922c-32b2268852fe Key Slot 0: ENABLED Iterations: 371834 Salt: b4 c4 45 24 58 63 03 64 c9 be b3 78 be 52 84 f3 49 64 dd c9 66 56 e9 3e 52 ad 0a 0a e9 4e d2 a0 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 19:41:49 -0400 (0:00:02.479) 0:03:13.160 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 19:41:49 -0400 (0:00:00.390) 0:03:13.550 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 19:41:49 -0400 (0:00:00.404) 0:03:13.955 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 19:41:50 -0400 (0:00:00.393) 0:03:14.348 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 19:41:50 -0400 (0:00:00.420) 0:03:14.769 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 19:41:51 -0400 (0:00:00.327) 0:03:15.097 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 19:41:51 -0400 (0:00:00.369) 0:03:15.466 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 19:41:51 -0400 (0:00:00.294) 0:03:15.760 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-5dfb6f1c-458e-4933-922c-32b2268852fe /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 19:41:52 -0400 (0:00:00.386) 0:03:16.147 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 19:41:52 -0400 (0:00:00.334) 0:03:16.481 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 19:41:52 -0400 (0:00:00.401) 0:03:16.883 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 19:41:53 -0400 (0:00:00.370) 0:03:17.254 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 19:41:53 -0400 (0:00:00.413) 0:03:17.667 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 19:41:53 -0400 (0:00:00.309) 0:03:17.977 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 19:41:54 -0400 (0:00:00.451) 0:03:18.429 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 19:41:54 -0400 (0:00:00.316) 0:03:18.745 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 19:41:54 -0400 (0:00:00.335) 0:03:19.080 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 19:41:55 -0400 (0:00:00.375) 0:03:19.456 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 19:41:56 -0400 (0:00:00.801) 0:03:20.257 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 
March 2026 19:41:56 -0400 (0:00:00.329) 0:03:20.586 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 19:41:56 -0400 (0:00:00.265) 0:03:20.852 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 19:41:57 -0400 (0:00:00.334) 0:03:21.186 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 19:41:57 -0400 (0:00:00.363) 0:03:21.549 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 19:41:57 -0400 (0:00:00.334) 0:03:21.884 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 19:41:58 -0400 (0:00:00.376) 0:03:22.260 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 19:41:58 -0400 (0:00:00.377) 0:03:22.638 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 19:41:58 -0400 (0:00:00.340) 0:03:22.979 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 19:41:59 -0400 (0:00:00.374) 0:03:23.353 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 19:41:59 -0400 (0:00:00.277) 0:03:23.631 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] 
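Note: every size-related task in this stretch is skipped, which is what you would expect for a volume of type "disk": the volume occupies the whole physical disk, so there is no requested-vs-actual size comparison to make, and the thin-pool arithmetic only applies to LVM volumes. The skip pattern is ordinary when: gating; an illustrative sketch (the condition and variable names are assumptions, not the role's exact logic):

    - name: Parse the actual size of the volume (illustrative gating only)
      set_fact:
        storage_test_actual_size: "{{ storage_test_blkinfo.info[storage_test_volume._device].size }}"
      when: storage_test_volume.type == "lvm"   # assumed condition, for illustration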
***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 19:41:59 -0400 (0:00:00.335) 0:03:23.966 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 19:42:00 -0400 (0:00:00.369) 0:03:24.336 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 19:42:00 -0400 (0:00:00.403) 0:03:24.740 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 19:42:00 -0400 (0:00:00.329) 0:03:25.069 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 19:42:01 -0400 (0:00:00.419) 0:03:25.489 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 19:42:01 -0400 (0:00:00.515) 0:03:26.005 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 19:42:02 -0400 (0:00:00.305) 0:03:26.311 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 19:42:02 -0400 (0:00:00.363) 0:03:26.674 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 19:42:02 -0400 (0:00:00.308) 0:03:26.983 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 19:42:03 -0400 (0:00:00.321) 0:03:27.304 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 19:42:03 -0400 (0:00:00.304) 0:03:27.609 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 19:42:03 -0400 (0:00:00.346) 0:03:27.955 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 19:42:04 -0400 (0:00:00.337) 0:03:28.293 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 19:42:04 -0400 (0:00:00.341) 0:03:28.635 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 19:42:04 -0400 (0:00:00.282) 0:03:28.917 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 19:42:05 -0400 (0:00:00.321) 0:03:29.238 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 19:42:05 -0400 (0:00:00.301) 0:03:29.540 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 19:42:05 -0400 (0:00:00.287) 0:03:29.827 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 19:42:06 -0400 (0:00:00.335) 0:03:30.163 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 19:42:06 -0400 (0:00:00.268) 0:03:30.432 ******* ok: 
[managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 19:42:06 -0400 (0:00:00.292) 0:03:30.724 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 19:42:06 -0400 (0:00:00.356) 0:03:31.081 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 19:42:07 -0400 (0:00:00.325) 0:03:31.407 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 19:42:07 -0400 (0:00:00.183) 0:03:31.591 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 19:42:07 -0400 (0:00:00.262) 0:03:31.853 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 19:42:07 -0400 (0:00:00.179) 0:03:32.033 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 19:42:08 -0400 (0:00:00.226) 0:03:32.259 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 19:42:08 -0400 (0:00:00.223) 0:03:32.483 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 19:42:08 -0400 (0:00:00.248) 0:03:32.732 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 19:42:08 -0400 (0:00:00.317) 0:03:33.050 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
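The tasks that follow come from create-test-file.yml and, after the failure check, verify-data-preservation.yml; together they implement the data-preservation guard for this scenario: touch a marker file on the mounted volume before the risky operation, then stat it afterwards and assert it survived. A minimal sketch of that pattern, with the path taken from the task output below and the task structure, register name, and message wording assumed rather than copied from the test:

    - name: Create a file
      file:
        path: /opt/test1/quux        # path reported in the task output below
        state: touch

    - name: Stat the file
      stat:
        path: /opt/test1/quux
      register: __storage_test_file  # register name is hypothetical

    - name: Assert file presence
      assert:
        that: __storage_test_file.stat.exists
        msg: Test file missing, data was not preserved   # wording assumed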
TASK [Create a file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 19:42:09 -0400 (0:00:00.308) 0:03:33.358 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:116 Wednesday 18 March 2026 19:42:12 -0400 (0:00:03.673) 0:03:37.032 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 19:42:13 -0400 (0:00:00.589) 0:03:37.622 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 19:42:14 -0400 (0:00:00.565) 0:03:38.188 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:42:14 -0400 (0:00:00.300) 0:03:38.545 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:42:14 -0400 (0:00:00.389) 0:03:39.235 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:42:15 -0400 (0:00:00.488) 0:03:39.723 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:42:15 -0400 (0:00:02.938) 0:03:42.662 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set
platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:42:18 -0400 (0:00:02.938) 0:03:42.662 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:42:19 -0400 (0:00:00.801) 0:03:43.463 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:42:19 -0400 (0:00:00.299) 0:03:43.762 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:42:20 -0400 (0:00:00.332) 0:03:44.094 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:42:20 -0400 (0:00:00.319) 0:03:44.414 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:42:20 -0400 (0:00:00.308) 0:03:44.723 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:42:21 -0400 (0:00:00.920) 0:03:45.644 ******* ok: 
[managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:42:25 -0400 (0:00:03.931) 0:03:49.575 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:42:25 -0400 (0:00:00.341) 0:03:49.916 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:42:26 -0400 (0:00:00.438) 0:03:50.355 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:42:32 -0400 (0:00:05.848) 0:03:56.203 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:42:32 -0400 (0:00:00.475) 0:03:56.679 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:42:32 -0400 (0:00:00.167) 0:03:56.847 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:42:33 -0400 (0:00:00.365) 0:03:57.213 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:42:33 -0400 (0:00:00.232) 0:03:57.445 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:42:36 -0400 (0:00:03.422) 0:04:00.868 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": 
{ "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:42:39 -0400 (0:00:03.011) 0:04:03.880 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:42:40 -0400 (0:00:00.490) 0:04:04.370 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-5dfb6f1c-458e-4933-922c-32b2268852fe' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 19:42:46 -0400 (0:00:05.756) 0:04:10.126 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10733223936, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on 
device 'luks-5dfb6f1c-458e-4933-922c-32b2268852fe' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:42:46 -0400 (0:00:00.253) 0:04:10.380 ******* TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 19:42:46 -0400 (0:00:00.437) 0:04:10.818 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 19:42:47 -0400 (0:00:00.296) 0:04:11.115 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 19:42:47 -0400 (0:00:00.932) 0:04:12.048 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 19:42:48 -0400 (0:00:00.575) 0:04:12.706 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877332.5537343, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773877332.5537343, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773877332.5537343, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "188923336", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 19:42:50 -0400 (0:00:01.676) 0:04:14.382 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:136 Wednesday 18 March 2026 19:42:50 -0400 (0:00:00.325) 0:04:14.708 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared
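The failure above is exactly the outcome this test asserts: the role was asked to take an existing LUKS-formatted device to encryption: false while storage_safe_mode was true (logged as storage_safe_mode_global: true), and blivet refuses to strip existing formatting in safe mode. The "Remove the encryption layer" pass that starts here presumably repeats the same request with safe mode disabled. A hedged reconstruction of the two invocations, with the volume spec taken from the Show storage_volumes output above and the surrounding task structure assumed:

    # Expected to fail: safe mode blocks removal of the existing LUKS formatting
    - name: Ask for an unencrypted volume while safe mode is on
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true            # the logged default
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: false              # device currently carries LUKS
            encryption_password: yabbadabbadoo

    # Expected to succeed: explicitly permit the destructive change (assumed follow-up)
    - name: Remove the encryption layer with safe mode off
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: false
            encryption_password: yabbadabbadoo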
TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:42:51 -0400 (0:00:00.479) 0:04:15.188 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:42:51 -0400 (0:00:00.152) 0:04:15.340 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:42:51 -0400 (0:00:00.218) 0:04:15.558 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:42:51 -0400 (0:00:00.313) 0:04:15.872 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:42:53 -0400 (0:00:02.136) 0:04:18.008 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:42:54 -0400 (0:00:00.589) 0:04:18.598 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:42:54 -0400 (0:00:00.367) 0:04:18.965 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
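The set_vars.yml loop above walks candidate vars files from generic to specific and loads only those that exist in the role, which is why CentOS_7.yml applies while RedHat.yml, CentOS.yml, and CentOS_7.9.yml are skipped. A sketch of that selection pattern as commonly written in the linux-system-roles collections (the actual task in set_vars.yml may differ in detail):

    - name: Set platform/version specific variables
      include_vars: "{{ __vars_file }}"
      loop:
        - "RedHat.yml"      # ansible_facts['os_family']
        - "CentOS.yml"      # ansible_facts['distribution']
        - "CentOS_7.yml"    # distribution + major version
        - "CentOS_7.9.yml"  # distribution + full version
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file   # candidates without a matching file are skipped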
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:42:55 -0400 (0:00:00.284) 0:04:19.250 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:42:55 -0400 (0:00:00.309) 0:04:19.559 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:42:55 -0400 (0:00:00.248) 0:04:19.808 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:42:56 -0400 (0:00:00.649) 0:04:20.458 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:42:59 -0400 (0:00:03.601) 0:04:24.059 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:43:00 -0400 (0:00:00.268) 0:04:24.328 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:43:00 -0400 (0:00:00.256) 0:04:24.585 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:43:05 -0400 (0:00:05.456) 0:04:30.041 ******* included:
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:43:06 -0400 (0:00:00.344) 0:04:30.385 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:43:06 -0400 (0:00:00.280) 0:04:30.666 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:43:06 -0400 (0:00:00.257) 0:04:30.923 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:43:06 -0400 (0:00:00.150) 0:04:31.073 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:43:10 -0400 (0:00:03.161) 0:04:34.235 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:43:12 -0400 (0:00:02.640) 0:04:36.876 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:43:13 -0400 (0:00:00.361) 0:04:37.237 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK 
[fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:43:18 -0400 (0:00:05.835) 0:04:43.073 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:43:19 -0400 (0:00:00.323) 0:04:43.396 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877265.9285445, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "992bc8be6d1b63071f8e8b938be00ab0c1e129d8", "ctime": 1773877265.9255447, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773877265.9255447, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:43:21 -0400 (0:00:01.768) 0:04:45.165 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:43:22 -0400 (0:00:01.638) 0:04:46.803 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:43:23 -0400 (0:00:00.472) 0:04:47.276 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], 
"volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:43:23 -0400 (0:00:00.392) 0:04:47.669 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:43:23 -0400 (0:00:00.339) 0:04:48.008 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:43:24 -0400 (0:00:00.401) 0:04:48.409 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": 
"xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5dfb6f1c-458e-4933-922c-32b2268852fe" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:43:26 -0400 (0:00:01.787) 0:04:50.197 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:43:28 -0400 (0:00:01.898) 0:04:52.096 ******* changed: [managed-node4] => (item={u'src': u'UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:43:29 -0400 (0:00:01.828) 0:04:53.925 ******* skipping: [managed-node4] => (item={u'src': u'UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:43:30 -0400 (0:00:00.414) 0:04:54.339 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:43:32 -0400 (0:00:01.761) 0:04:56.101 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877285.5536003, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "be11ac4a703887cdc3aa3aca5fc7a386e61bd18e", "ctime": 1773877272.1685624, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264032, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, 
"isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773877272.1685624, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744071680135212", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:43:33 -0400 (0:00:01.744) 0:04:57.845 ******* changed: [managed-node4] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-5dfb6f1c-458e-4933-922c-32b2268852fe', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:43:35 -0400 (0:00:02.127) 0:04:59.973 ******* ok: [managed-node4] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:148 Wednesday 18 March 2026 19:43:39 -0400 (0:00:03.303) 0:05:03.277 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 19:43:40 -0400 (0:00:00.922) 0:05:04.199 ******* skipping: [managed-node4] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 19:43:40 -0400 (0:00:00.229) 0:05:04.429 ******* ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:43:40 -0400 (0:00:00.441) 0:05:04.870 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ebb7dd01-98d5-44b8-a9c2-20d4d4b00459" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:43:43 -0400 (0:00:02.637) 0:05:07.508 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002583", "end": "2026-03-18 19:43:44.954462", "rc": 0, "start": "2026-03-18 19:43:44.951879" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] 
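
Both file checks in verify-role-results.yml follow the same pattern: cat the file, register the output, and let later assertions match against it. A sketch of that pattern, assuming a register variable name like storage_test_fstab (the actual variable name is not visible in this log):

- name: Read the /etc/fstab file for volume existence (sketch)
  command: cat /etc/fstab
  register: storage_test_fstab
  changed_when: false
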
********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 19:43:45 -0400 (0:00:01.920) 0:05:09.429 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002971", "end": "2026-03-18 19:43:46.551243", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:43:46.548272" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:43:46 -0400 (0:00:01.524) 0:05:10.953 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 19:43:47 -0400 (0:00:00.257) 0:05:11.210 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:43:47 -0400 (0:00:00.676) 0:05:11.887 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:43:48 -0400 (0:00:00.512) 0:05:12.399 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 19:43:50 -0400 (0:00:01.847) 0:05:14.246 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 19:43:50 -0400 (0:00:00.436) 0:05:14.683 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 19:43:50 -0400 (0:00:00.400) 0:05:15.083 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 19:43:51 -0400 (0:00:00.497) 0:05:15.581 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 19:43:52 -0400 (0:00:00.565) 0:05:16.146 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 19:43:52 -0400 (0:00:00.407) 0:05:16.554 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 19:43:52 -0400 (0:00:00.358) 0:05:16.912 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 19:43:53 -0400 (0:00:00.380) 0:05:17.293 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 19:43:53 -0400 (0:00:00.429) 0:05:17.723 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 19:43:54 -0400 (0:00:00.507) 0:05:18.230 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 
19:43:54 -0400 (0:00:00.379) 0:05:18.610 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 19:43:54 -0400 (0:00:00.316) 0:05:18.927 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 19:43:55 -0400 (0:00:00.687) 0:05:19.614 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 19:43:55 -0400 (0:00:00.400) 0:05:20.015 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 19:43:56 -0400 (0:00:00.302) 0:05:20.318 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 19:43:56 -0400 (0:00:00.353) 0:05:20.672 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 19:43:57 -0400 (0:00:00.585) 0:05:21.257 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 19:43:57 -0400 (0:00:00.411) 0:05:21.669 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 19:43:58 -0400 (0:00:00.619) 0:05:22.288 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 19:43:58 -0400 (0:00:00.469) 0:05:22.758 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877398.570924, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877398.570924, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28971, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773877398.570924, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 19:44:00 -0400 (0:00:02.043) 0:05:24.801 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 19:44:01 -0400 (0:00:00.518) 0:05:25.320 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 19:44:01 -0400 (0:00:00.322) 0:05:25.642 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 19:44:01 -0400 (0:00:00.445) 0:05:26.088 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 19:44:02 -0400 (0:00:00.438) 0:05:26.526 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 19:44:02 -0400 (0:00:00.353) 0:05:26.880 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
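
The device checks above stat the expected node and assert on the result; /dev/sda reports exists=true and isblk=true, which is what a type: disk volume requires. A sketch of that check, with the register variable name assumed rather than taken from the log:

- name: See whether the device node is present (sketch)
  stat:
    path: /dev/sda
  register: storage_test_dev

- name: Verify the presence/absence of the device node (sketch)
  assert:
    that:
      - storage_test_dev.stat.exists
      - storage_test_dev.stat.isblk
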
************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 19:44:03 -0400 (0:00:00.469) 0:05:27.350 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 19:44:03 -0400 (0:00:00.289) 0:05:27.639 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 19:44:07 -0400 (0:00:03.818) 0:05:31.458 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 19:44:07 -0400 (0:00:00.320) 0:05:31.778 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 19:44:08 -0400 (0:00:00.330) 0:05:32.108 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 19:44:08 -0400 (0:00:00.525) 0:05:32.633 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 19:44:08 -0400 (0:00:00.322) 0:05:32.955 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 19:44:09 -0400 (0:00:00.336) 0:05:33.292 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 19:44:09 -0400 (0:00:00.270) 0:05:33.562 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 19:44:09 -0400 (0:00:00.329) 0:05:33.892 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 19:44:10 -0400 (0:00:00.380) 0:05:34.273 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 19:44:10 -0400 (0:00:00.421) 0:05:34.694 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 19:44:10 -0400 (0:00:00.381) 0:05:35.076 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 19:44:11 -0400 (0:00:00.367) 0:05:35.444 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 19:44:11 -0400 (0:00:00.370) 0:05:35.814 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 19:44:12 -0400 (0:00:00.453) 0:05:36.268 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 19:44:12 -0400 (0:00:00.393) 0:05:36.662 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 19:44:12 -0400 (0:00:00.325) 0:05:36.987 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
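
The crypttab check above passes because the entry for luks-5dfb6f1c-458e-4933-922c-32b2268852fe was removed earlier in this run, leaving _storage_test_crypttab_entries empty against an expected count of "0". The assertion reduces to comparing that list's length with the expected count, roughly as follows (a sketch using the fact names shown above; the exact expression in test-verify-volume-encryption.yml may differ):

- name: Check for /etc/crypttab entry (sketch)
  assert:
    that:
      - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
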
************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 19:44:13 -0400 (0:00:00.397) 0:05:37.384 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 19:44:13 -0400 (0:00:00.323) 0:05:37.708 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 19:44:13 -0400 (0:00:00.337) 0:05:38.046 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 19:44:14 -0400 (0:00:00.353) 0:05:38.400 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 19:44:14 -0400 (0:00:00.333) 0:05:38.734 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 19:44:14 -0400 (0:00:00.347) 0:05:39.082 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 19:44:15 -0400 (0:00:00.340) 0:05:39.422 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 19:44:15 -0400 (0:00:00.375) 0:05:39.798 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 19:44:16 -0400 (0:00:00.367) 0:05:40.165 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 19:44:16 
-0400 (0:00:00.391) 0:05:40.556 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 19:44:16 -0400 (0:00:00.350) 0:05:40.907 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 19:44:17 -0400 (0:00:00.342) 0:05:41.249 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 19:44:17 -0400 (0:00:00.306) 0:05:41.556 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 19:44:17 -0400 (0:00:00.391) 0:05:41.948 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 19:44:18 -0400 (0:00:00.389) 0:05:42.337 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 19:44:18 -0400 (0:00:00.414) 0:05:42.752 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 19:44:18 -0400 (0:00:00.322) 0:05:43.074 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 19:44:19 -0400 (0:00:00.321) 0:05:43.396 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 19:44:19 -0400 (0:00:00.325) 0:05:43.721 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 
March 2026 19:44:19 -0400 (0:00:00.326) 0:05:44.048 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 19:44:20 -0400 (0:00:00.351) 0:05:44.400 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 19:44:20 -0400 (0:00:00.198) 0:05:44.598 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 19:44:20 -0400 (0:00:00.216) 0:05:44.815 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 19:44:21 -0400 (0:00:00.352) 0:05:45.167 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 19:44:21 -0400 (0:00:00.309) 0:05:45.477 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 19:44:21 -0400 (0:00:00.203) 0:05:45.680 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 19:44:21 -0400 (0:00:00.349) 0:05:46.030 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 19:44:22 -0400 (0:00:00.266) 0:05:46.296 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 19:44:22 -0400 (0:00:00.314) 0:05:46.611 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 
March 2026 19:44:22 -0400 (0:00:00.304) 0:05:46.916 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 19:44:23 -0400 (0:00:00.393) 0:05:47.309 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 19:44:23 -0400 (0:00:00.266) 0:05:47.576 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 19:44:23 -0400 (0:00:00.339) 0:05:47.915 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 19:44:24 -0400 (0:00:00.306) 0:05:48.222 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 19:44:24 -0400 (0:00:00.286) 0:05:48.508 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 19:44:24 -0400 (0:00:00.344) 0:05:48.853 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 19:44:25 -0400 (0:00:00.274) 0:05:49.128 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 19:44:25 -0400 (0:00:00.315) 0:05:49.443 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 19:44:26 -0400 (0:00:00.911) 0:05:50.354 ******* skipping: [managed-node4] => { "changed": 
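
The size verification above parses a requested size, computes an expected value, and asserts it against the actual device size; every step is skipped in this run because no size check applies to the volume. The conversion step in checks like this is typically the human_to_bytes filter. A minimal sketch under that assumption; the 2% slack for metadata overhead and the variable names are illustrative, not the test's exact logic:

- name: Establish base value for expected size
  set_fact:
    # '10g' matches the 10737418240-byte volume spec seen elsewhere in this run
    storage_test_expected_size: "{{ '10g' | human_to_bytes }}"

- name: Assert expected size is actual size
  assert:
    that:
      - (storage_test_actual_size | int - storage_test_expected_size | int) | abs
        <= (storage_test_expected_size | int) * 0.02
    msg: actual size differs from expected size by more than 2%
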
false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 19:44:26 -0400 (0:00:00.279) 0:05:50.634 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 19:44:26 -0400 (0:00:00.306) 0:05:50.940 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 19:44:27 -0400 (0:00:00.286) 0:05:51.227 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 19:44:27 -0400 (0:00:00.380) 0:05:51.607 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 19:44:27 -0400 (0:00:00.335) 0:05:51.943 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 19:44:28 -0400 (0:00:00.217) 0:05:52.161 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:154 Wednesday 18 March 2026 19:44:29 -0400 (0:00:01.780) 0:05:53.941 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 19:44:30 -0400 (0:00:00.586) 0:05:54.528 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 19:44:30 -0400 (0:00:00.380) 
0:05:54.908 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:44:31 -0400 (0:00:00.275) 0:05:55.184 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:44:31 -0400 (0:00:00.286) 0:05:55.471 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:44:31 -0400 (0:00:00.217) 0:05:55.689 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:44:31 -0400 (0:00:00.375) 0:05:56.064 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:44:34 -0400 (0:00:02.803) 0:05:58.868 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:44:35 -0400 (0:00:00.690) 0:05:59.559 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 
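
The four-item loop above (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml) is the standard most-specific-wins variable lookup: the role tries os-family, distribution, major-version, and full-version vars files in turn, and on this node only CentOS_7.yml exists and matches. A commonly used compact variant of the same idea, shown as a sketch rather than the role's literal task, uses the first_found lookup:

- name: Set platform/version specific variables
  include_vars: "{{ lookup('first_found', params) }}"
  vars:
    params:
      files:
        # most specific first; the first file that exists wins
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['os_family'] }}.yml"
      paths:
        - vars

Note the difference: the role's loop can layer several matching files on top of each other, while first_found loads only the single best match.
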
19:44:35 -0400 (0:00:00.472) 0:06:00.032 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:44:36 -0400 (0:00:00.257) 0:06:00.290 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:44:36 -0400 (0:00:00.313) 0:06:00.603 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:44:36 -0400 (0:00:00.353) 0:06:00.956 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:44:37 -0400 (0:00:00.823) 0:06:01.780 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:44:40 -0400 (0:00:03.275) 0:06:05.055 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:44:41 -0400 (0:00:00.375) 0:06:05.431 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:44:41 -0400 (0:00:00.212) 0:06:05.644 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], 
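
The storage_volumes value printed above corresponds to a role invocation along these lines (the password is the test's dummy value from this log; in real use it belongs in Ansible Vault):

- hosts: managed-node4
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo
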
"mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:44:46 -0400 (0:00:05.214) 0:06:10.859 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:44:47 -0400 (0:00:00.410) 0:06:11.270 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:44:47 -0400 (0:00:00.143) 0:06:11.413 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:44:47 -0400 (0:00:00.242) 0:06:11.655 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:44:47 -0400 (0:00:00.195) 0:06:11.851 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:44:50 -0400 (0:00:02.742) 0:06:14.593 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": 
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { 
"name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service": { "name": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:44:53 -0400 (0:00:02.769) 0:06:17.363 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d5dfb6f1c\x2d458e\x2d4933\x2d922c\x2d32b2268852fe.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "name": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service systemd-readahead-replay.service systemd-journald.socket cryptsetup-pre.target dev-sda.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-5dfb6f1c-458e-4933-922c-32b2268852fe", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; 
argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-5dfb6f1c-458e-4933-922c-32b2268852fe /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-5dfb6f1c-458e-4933-922c-32b2268852fe ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
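
Before touching the disks, the role collects service facts and masks the generator-created systemd-cryptsetup@ units (note the FragmentPath under /run/systemd/generator above), so that no stale unit races to reopen the old LUKS mapping while the device is being reworked. A sketch of that mask step; the key-matching expression is an assumption, not the role's exact code:

- name: Get service facts
  service_facts:

- name: Mask the systemd cryptsetup services
  systemd:
    name: "{{ item }}"
    masked: yes
  # ansible_facts.services is a dict keyed by unit name; select() iterates the keys
  loop: "{{ ansible_facts.services | select('match', '^systemd-cryptsetup@') | list }}"
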
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:44:55 -0400 (0:00:02.065) 0:06:19.429 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 19:45:01 -0400 (0:00:05.781) 0:06:25.211 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding 
encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:45:01 -0400 (0:00:00.475) 0:06:25.687 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d5dfb6f1c\x2d458e\x2d4933\x2d922c\x2d32b2268852fe.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "name": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d5dfb6f1c\\x2d458e\\x2d4933\\x2d922c\\x2d32b2268852fe.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": 
"inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 19:45:04 -0400 (0:00:02.456) 0:06:28.143 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 19:45:04 -0400 (0:00:00.405) 0:06:28.548 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 19:45:04 -0400 (0:00:00.483) 0:06:29.031 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 19:45:05 -0400 (0:00:00.427) 0:06:29.459 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877469.5751276, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773877469.5751276, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773877469.5751276, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1529131658", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 19:45:07 -0400 (0:00:01.940) 0:06:31.400 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:174 Wednesday 18 March 2026 19:45:07 -0400 (0:00:00.357) 
0:06:31.758 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:45:08 -0400 (0:00:00.818) 0:06:32.576 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:45:08 -0400 (0:00:00.179) 0:06:32.756 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:45:08 -0400 (0:00:00.235) 0:06:32.991 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:45:09 -0400 (0:00:00.392) 0:06:33.384 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:45:11 -0400 (0:00:02.343) 0:06:35.727 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:45:12 -0400 (0:00:00.652) 0:06:36.380 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 
19:45:12 -0400 (0:00:00.292) 0:06:36.672 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:45:12 -0400 (0:00:00.321) 0:06:36.993 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:45:13 -0400 (0:00:00.276) 0:06:37.270 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:45:13 -0400 (0:00:00.177) 0:06:37.447 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:45:14 -0400 (0:00:01.437) 0:06:38.884 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:45:17 -0400 (0:00:02.775) 0:06:41.659 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:45:17 -0400 (0:00:00.312) 0:06:41.972 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:45:18 -0400 (0:00:00.327) 0:06:42.299 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], 
"mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:45:23 -0400 (0:00:05.446) 0:06:47.746 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:45:24 -0400 (0:00:00.468) 0:06:48.216 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:45:24 -0400 (0:00:00.117) 0:06:48.334 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:45:24 -0400 (0:00:00.160) 0:06:48.495 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:45:24 -0400 (0:00:00.102) 0:06:48.597 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:45:26 -0400 (0:00:02.421) 0:06:51.018 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": 
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { 
"name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:45:29 -0400 (0:00:02.919) 0:06:53.938 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:45:30 -0400 (0:00:00.376) 0:06:54.315 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-245394af-792c-404d-9215-db78c843f58b", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:45:42 -0400 (0:00:12.041) 0:07:06.356 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:45:42 -0400 (0:00:00.483) 0:07:06.839 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877409.5879555, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ed450e27ccfb6a825ce32c7cb333fd6e24c41d66", "ctime": 1773877409.5849555, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773877409.5849555, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:45:44 -0400 (0:00:01.882) 0:07:08.722 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:45:46 -0400 (0:00:01.833) 0:07:10.556 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:45:46 -0400 (0:00:00.424) 0:07:10.980 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-245394af-792c-404d-9215-db78c843f58b", "password": "-", "state": "present" } ], "failed": false, 
"leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:45:47 -0400 (0:00:00.398) 0:07:11.379 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:45:47 -0400 (0:00:00.361) 0:07:11.740 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": 
"present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:45:48 -0400 (0:00:00.471) 0:07:12.211 ******* changed: [managed-node4] => (item={u'src': u'UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=ebb7dd01-98d5-44b8-a9c2-20d4d4b00459" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:45:50 -0400 (0:00:02.007) 0:07:14.219 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:45:52 -0400 (0:00:02.007) 0:07:16.226 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:45:54 -0400 (0:00:01.941) 0:07:18.167 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:45:54 -0400 (0:00:00.419) 0:07:18.587 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the 
/etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:45:56 -0400 (0:00:02.041) 0:07:20.628 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877426.5500042, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773877415.4829724, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264033, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773877415.4829724, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071680135380", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:45:58 -0400 (0:00:01.644) 0:07:22.273 ******* changed: [managed-node4] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-245394af-792c-404d-9215-db78c843f58b', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-245394af-792c-404d-9215-db78c843f58b", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:46:00 -0400 (0:00:01.934) 0:07:24.207 ******* ok: [managed-node4] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:186 Wednesday 18 March 2026 19:46:02 -0400 (0:00:02.483) 0:07:26.691 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 19:46:03 -0400 (0:00:00.806) 0:07:27.497 ******* skipping: [managed-node4] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 19:46:03 -0400 (0:00:00.290) 0:07:27.787 ******* ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:46:04 -0400 (0:00:00.358) 0:07:28.146 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "size": "10G", "type": "crypt", "uuid": "efa4b88a-81bf-427a-bf59-ca7e6d188d7f" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "245394af-792c-404d-9215-db78c843f58b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:46:05 -0400 (0:00:01.924) 0:07:30.070 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002738", "end": "2026-03-18 19:46:07.298969", "rc": 0, "start": "2026-03-18 19:46:07.296231" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 19:46:07 -0400 (0:00:01.678) 0:07:31.749 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003552", "end": "2026-03-18 19:46:08.975397", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:46:08.971845" } STDOUT: luks-245394af-792c-404d-9215-db78c843f58b /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:46:09 -0400 (0:00:01.668) 0:07:33.417 ******* TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 19:46:09 -0400 (0:00:00.234) 0:07:33.652 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:46:10 -0400 (0:00:00.561) 0:07:34.214 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:46:10 -0400 (0:00:00.385) 0:07:34.599 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: 
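[Note: the crypttab(5) entry above carries the three fields the role manages: mapped name, backing device, and key file, where "-" means no key file is stored, so the passphrase is prompted for at activation. The fstab line correspondingly mounts the opened mapping /dev/mapper/luks-..., never the raw /dev/sda. A minimal stand-alone recheck along the same lines; the task name and the expected_entry variable are illustrative assumptions, not part of the test suite:

    # grep -Fx matches the whole line literally, so any drift in the
    # entry makes the task fail (rc != 0); this is an assumed helper,
    # not a task from the role or its tests.
    - name: Assert the crypttab entry is present verbatim
      vars:
        expected_entry: "luks-245394af-792c-404d-9215-db78c843f58b /dev/sda -"
      command: grep -Fx "{{ expected_entry }}" /etc/crypttab
      changed_when: false
]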
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:46:09 -0400 (0:00:01.668) 0:07:33.417 *******
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 19:46:09 -0400 (0:00:00.234) 0:07:33.652 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4
TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:46:10 -0400 (0:00:00.561) 0:07:34.214 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:46:10 -0400 (0:00:00.385) 0:07:34.599 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 19:46:12 -0400 (0:00:02.174) 0:07:36.774 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 19:46:13 -0400 (0:00:00.326) 0:07:37.100 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 19:46:13 -0400 (0:00:00.392) 0:07:37.493 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 19:46:13 -0400 (0:00:00.394) 0:07:37.888 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 19:46:14 -0400 (0:00:00.310) 0:07:38.198 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 19:46:14 -0400 (0:00:00.325) 0:07:38.524 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 19:46:14 -0400 (0:00:00.329) 0:07:38.853 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 19:46:15 -0400 (0:00:00.376) 0:07:39.229 ******* skipping:
[managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 19:46:15 -0400 (0:00:00.331) 0:07:39.561 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 19:46:15 -0400 (0:00:00.226) 0:07:39.788 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 19:46:15 -0400 (0:00:00.195) 0:07:39.984 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 19:46:16 -0400 (0:00:00.297) 0:07:40.281 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 19:46:16 -0400 (0:00:00.540) 0:07:40.821 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 19:46:17 -0400 (0:00:00.364) 0:07:41.186 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 19:46:17 -0400 (0:00:00.448) 0:07:41.635 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 19:46:17 -0400 (0:00:00.269) 0:07:41.904 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up 
variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 19:46:18 -0400 (0:00:00.338) 0:07:42.242 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 19:46:18 -0400 (0:00:00.363) 0:07:42.606 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 19:46:19 -0400 (0:00:00.567) 0:07:43.173 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 19:46:19 -0400 (0:00:00.713) 0:07:43.887 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877541.5573337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877541.5573337, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28971, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773877541.5573337, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 19:46:21 -0400 (0:00:01.961) 0:07:45.849 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 19:46:22 -0400 (0:00:00.427) 0:07:46.276 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 19:46:22 -0400 (0:00:00.349) 0:07:46.625 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** 
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 19:46:23 -0400 (0:00:00.480) 0:07:47.106 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }
TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 19:46:23 -0400 (0:00:00.341) 0:07:47.447 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 19:46:23 -0400 (0:00:00.400) 0:07:47.847 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 19:46:24 -0400 (0:00:00.434) 0:07:48.281 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877541.665334, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877541.665334, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 124814, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773877541.665334, "nlink": 1, "path": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 19:46:26 -0400 (0:00:01.818) 0:07:50.100 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 19:46:29 -0400 (0:00:03.484) 0:07:53.585 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.025048", "end": "2026-03-18 19:46:31.049727", "rc": 0, "start": "2026-03-18 19:46:31.024679" }
STDOUT:
LUKS header information for /dev/sda
Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 8192
MK bits:        512
MK digest:      b2 33 4d 7a 5f 24 b7 59 5f 27 34 5d f1 1d de 2f b2 5a dd b3
MK salt:        7b 38 08 29 a4 70 c1 8d 54 48 52 2c 9f 4b 84 16 a8 44 1a 63 47 a9 48 d5 3a e6 b0 1f 94 4d c3 43
MK iterations:  23043
UUID:           245394af-792c-404d-9215-db78c843f58b
Key Slot 0: ENABLED
        Iterations:          368178
        Salt:                c1 a6 4f 60 44 78 ec ed 82 53 63 3a d7 6c 19 23 f0 2f 79 bf bd ac 9b cc 8e 4d 4f b1 05 1b 33 03
        Key material offset: 8
        AF stripes:          4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 19:46:31 -0400 (0:00:01.846) 0:07:55.431 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 19:46:31 -0400 (0:00:00.354) 0:07:55.785 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 19:46:32 -0400 (0:00:00.499) 0:07:56.285 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 19:46:32 -0400 (0:00:00.414) 0:07:56.699 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 19:46:32 -0400 (0:00:00.378) 0:07:57.078 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 19:46:33 -0400 (0:00:00.252) 0:07:57.331 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 19:46:33 -0400 (0:00:00.359) 0:07:57.690 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 19:46:33 -0400 (0:00:00.363) 0:07:58.054 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-245394af-792c-404d-9215-db78c843f58b /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
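[Note: the dump above shows a LUKS1 header (Version: 1, aes in xts-plain64 mode, sha256), which matches the default for the cryptsetup 2.0.3 package installed on this CentOS 7 node; the "Check LUKS version", "Check LUKS key size", and "Check LUKS cipher" tasks are skipped because this test pins none of those settings. A minimal stand-alone version check in the same spirit as the "Collect LUKS info" task; the task names and the luks_dump variable are illustrative assumptions:

    # Assumed helper tasks, not part of the role or its tests.
    - name: Collect the LUKS header
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false

    - name: Assert the header is LUKS1
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')
]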
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 19:46:34 -0400 (0:00:00.477) 0:07:58.531 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 19:46:34 -0400 (0:00:00.507) 0:07:59.039 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 19:46:35 -0400 (0:00:00.396) 0:07:59.435 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 19:46:35 -0400 (0:00:00.495) 0:07:59.930 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 19:46:36 -0400 (0:00:00.546) 0:08:00.477 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 19:46:36 -0400 (0:00:00.300) 0:08:00.777 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 19:46:37 -0400 (0:00:00.404) 0:08:01.182 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 19:46:37 -0400 (0:00:00.302) 0:08:01.484 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md version regex] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 19:46:37 -0400 (0:00:00.326) 0:08:01.811 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 19:46:38 -0400 (0:00:00.394) 0:08:02.205 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the chunk size]
**************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 19:46:38 -0400 (0:00:00.343) 0:08:02.548 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 19:46:38 -0400 (0:00:00.343) 0:08:02.892 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 19:46:39 -0400 (0:00:00.300) 0:08:03.192 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 19:46:39 -0400 (0:00:00.436) 0:08:03.628 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 19:46:39 -0400 (0:00:00.366) 0:08:03.995 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 19:46:40 -0400 (0:00:00.503) 0:08:04.498 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 19:46:40 -0400 (0:00:00.341) 0:08:04.840 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 19:46:41 -0400 (0:00:00.363) 0:08:05.204 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 19:46:41 -0400 (0:00:00.396) 0:08:05.601 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 19:46:41 -0400 (0:00:00.358) 0:08:05.959 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 19:46:42 -0400 (0:00:00.332) 0:08:06.291 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 19:46:42 -0400 (0:00:00.298) 0:08:06.590 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 19:46:42 -0400 (0:00:00.328) 0:08:06.918 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 19:46:43 -0400 (0:00:00.356) 0:08:07.275 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 19:46:43 -0400 (0:00:00.324) 0:08:07.599 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 19:46:43 -0400 (0:00:00.264) 0:08:07.864 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 19:46:44 -0400 (0:00:00.357) 0:08:08.221 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 19:46:44 -0400 (0:00:00.435) 0:08:08.657 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 19:46:44 -0400 (0:00:00.417) 0:08:09.074 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 19:46:45 -0400 (0:00:00.391) 0:08:09.466 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 19:46:45 -0400 (0:00:00.435) 0:08:09.901 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 19:46:46 -0400 (0:00:00.391) 0:08:10.292 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 19:46:46 -0400 (0:00:00.332) 0:08:10.625 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 19:46:46 -0400 (0:00:00.349) 0:08:10.975 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 19:46:47 -0400 (0:00:00.429) 0:08:11.404 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 19:46:47 -0400 (0:00:00.360) 0:08:11.764 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 19:46:47 -0400 (0:00:00.308) 0:08:12.072 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 19:46:48 -0400 (0:00:00.363) 0:08:12.436 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 19:46:48 -0400 (0:00:00.303) 0:08:12.740 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task 
path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 19:46:49 -0400 (0:00:00.440) 0:08:13.180 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 19:46:49 -0400 (0:00:00.310) 0:08:13.491 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 19:46:49 -0400 (0:00:00.336) 0:08:13.827 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 19:46:50 -0400 (0:00:00.359) 0:08:14.187 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 19:46:50 -0400 (0:00:00.486) 0:08:14.674 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 19:46:50 -0400 (0:00:00.397) 0:08:15.071 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 19:46:51 -0400 (0:00:00.370) 0:08:15.442 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 19:46:51 -0400 (0:00:00.398) 0:08:15.840 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 19:46:52 -0400 (0:00:00.449) 0:08:16.290 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 19:46:52 -0400 (0:00:00.323) 0:08:16.614 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 19:46:52 -0400 (0:00:00.254) 0:08:16.868 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 19:46:53 -0400 (0:00:00.312) 0:08:17.181 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:193 Wednesday 18 March 2026 19:46:53 -0400 (0:00:00.359) 0:08:17.540 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 19:46:54 -0400 (0:00:00.627) 0:08:18.168 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 19:46:54 -0400 (0:00:00.361) 0:08:18.529 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:46:54 -0400 (0:00:00.440) 0:08:18.970 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:46:55 -0400 (0:00:00.434) 0:08:19.404 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:46:55 -0400 (0:00:00.349) 0:08:19.754 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:46:56 -0400 (0:00:00.358) 0:08:20.113 ******* ok: [managed-node4]
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:46:59 -0400 (0:00:03.088) 0:08:23.201 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:46:59 -0400 (0:00:00.633) 0:08:23.835 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:47:00 -0400 (0:00:00.403) 0:08:24.239 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:47:00 -0400 (0:00:00.228) 0:08:24.468 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:47:00 -0400 (0:00:00.385) 0:08:24.853 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:47:01 -0400 (0:00:00.420) 0:08:25.274 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4
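[Note: the loop above resolves platform vars from most generic to most specific (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml) and loads only the candidates that exist; here only CentOS_7.yml is present. The blivet_package_list entry keeps its Jinja conditional unrendered in the fact, so it resolves to libblockdev (or libblockdev-s390 on s390x hosts) only when the list is consumed. A sketch of the pattern; the candidate expressions and the when guard are illustrative, and the role's actual task differs in detail:

    - name: Set platform/version specific variables (sketch)
      include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      # load a candidate only when that vars file actually exists
      when: (role_path ~ '/vars/' ~ item) is file
]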
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:47:01 -0400 (0:00:00.725) 0:08:26.000 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:47:05 -0400 (0:00:03.812) 0:08:29.812 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:47:06 -0400 (0:00:00.411) 0:08:30.223 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] }
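[Note: the storage_pools value just shown is the test input for this run: a partition pool on sda with one encrypted volume and no encryption key, and the enclosing "Verify role raises correct error - 2" block expects the role to fail for exactly that reason. A reconstruction of the driving invocation; the play wrapper is illustrative, while the pool and volume values are taken verbatim from the output above:

    - hosts: managed-node4
      tasks:
        - name: Request an encrypted volume without supplying a key
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: true   # per the "Store global variable value copy" task earlier
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    mount_point: /opt/test1
                    encryption: true
]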
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:47:06 -0400 (0:00:00.442) 0:08:30.665 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:47:12 -0400 (0:00:06.010) 0:08:36.676 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:47:13 -0400 (0:00:00.433) 0:08:37.110 *******
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:47:13 -0400 (0:00:00.245) 0:08:37.355 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:47:13 -0400 (0:00:00.248) 0:08:37.604 *******
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:47:13 -0400 (0:00:00.129) 0:08:37.734 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:47:17 -0400 (0:00:03.457) 0:08:41.191 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source":
"systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, 
"halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task 
TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Wednesday 18 March 2026 19:47:20 -0400 (0:00:03.039) 0:08:44.231 *******

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Wednesday 18 March 2026 19:47:20 -0400 (0:00:00.569) 0:08:44.800 *******
fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

MSG: encrypted volume 'test1' missing key/password

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Wednesday 18 March 2026 19:47:26 -0400 (0:00:05.425) 0:08:50.226 *******
fatal: [managed-node4]: FAILED! => { "changed": false }

MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [],
u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'},
u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present',
u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'deduplication': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}],
u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}],
u'volumes': [],
u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None},
u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''},
u'use_partitions': None}},
u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Wednesday 18 March 2026 19:47:26 -0400 (0:00:00.427) 0:08:50.653 *******

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22
Wednesday 18 March 2026 19:47:26 -0400 (0:00:00.337) 0:08:50.990 *******
ok: [managed-node4] => { "changed": false }

MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27
Wednesday 18 March 2026 19:47:27 -0400 (0:00:00.381) 0:08:51.372 *******
ok: [managed-node4] => { "changed": false }

MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38
Wednesday 18 March 2026 19:47:27 -0400 (0:00:00.570) 0:08:51.943 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
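The failure above is deliberate: this part of the test drives the role without a passphrase to confirm it refuses rather than create an unopenable volume. The module_args dump shows the trigger exactly: the test1 volume carries u'encryption': True while u'encryption_key' and u'encryption_password' are both None, and safe_mode is already False, so the only fix is to supply a secret. A sketch of the corrected input, using only role variables visible in this log (the passphrase is the test's own value, printed later by Show storage_pools):

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo   # or encryption_key pointing at a key file

With either encryption_password or encryption_key set, the same "Manage the pools and volumes" task succeeds, as the re-run below shows.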
TASK [Create an encrypted partition volume w/ default fs] **********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:212
Wednesday 18 March 2026 19:47:28 -0400 (0:00:00.232) 0:08:52.176 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24
Wednesday 18 March 2026 19:47:29 -0400 (0:00:01.162) 0:08:53.339 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34
Wednesday 18 March 2026 19:47:29 -0400 (0:00:00.232) 0:08:53.571 *******

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 18 March 2026 19:47:29 -0400 (0:00:00.338) 0:08:53.910 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 18 March 2026 19:47:30 -0400 (0:00:00.401) 0:08:54.311 *******
ok: [managed-node4]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 18 March 2026 19:47:32 -0400 (0:00:02.669) 0:08:56.981 *******
skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
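The loop above is the usual linux-system-roles layering of platform defaults: candidate vars files run from generic to specific (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml) and only the ones present for this host get loaded, here just CentOS_7.yml, whose blivet_package_list even embeds a Jinja2 conditional so s390x hosts pull libblockdev-s390 instead of libblockdev. A rough sketch of the pattern; the role's real task at set_vars.yml:7 builds its candidate list from facts and may differ in detail:

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - RedHat.yml        # distribution family
        - CentOS.yml        # distribution
        - CentOS_7.yml      # major version
        - CentOS_7.9.yml    # exact version
      # illustrative guard: skip candidates whose vars file does not exist
      when: (role_path ~ '/vars/' ~ item) is exists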
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 18 March 2026 19:47:33 -0400 (0:00:00.640) 0:08:57.621 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 18 March 2026 19:47:33 -0400 (0:00:00.286) 0:08:57.908 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 18 March 2026 19:47:34 -0400 (0:00:00.222) 0:08:58.131 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 18 March 2026 19:47:34 -0400 (0:00:00.287) 0:08:58.419 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 18 March 2026 19:47:34 -0400 (0:00:00.296) 0:08:58.716 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 18 March 2026 19:47:35 -0400 (0:00:00.811) 0:08:59.527 *******
ok: [managed-node4] => { "changed": false, "rc": 0, "results": [
"python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed",
"1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed",
"libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed",
"libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed",
"libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed",
"libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed",
"libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed",
"libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed",
"libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 18 March 2026 19:47:39 -0400 (0:00:00.347) 0:09:03.647 *******
ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
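Show storage_pools above prints the passphrase in the clear because the test's debug task carries no no_log; the blivet module itself does mask it, which is why it later appears as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER. Outside a disposable test VM, the usual precaution is to feed the secret from an encrypted source; a sketch assuming a vaulted variable named vault_luks_password (a hypothetical name, not something this test defines):

    # group_vars/all/vault.yml -- keep encrypted with `ansible-vault encrypt`
    vault_luks_password: yabbadabbadoo

    # playbook vars
    storage_pools:
      - name: foo
        type: partition
        disks: [sda]
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_password: "{{ vault_luks_password }}"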
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 18 March 2026 19:47:39 -0400 (0:00:00.353) 0:09:03.995 *******
ok: [managed-node4] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 18 March 2026 19:47:40 -0400 (0:00:00.353) 0:09:04.348 *******
ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Wednesday 18 March 2026 19:47:45 -0400 (0:00:05.729) 0:09:10.077 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 18 March 2026 19:47:46 -0400 (0:00:00.345) 0:09:10.423 *******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 18 March 2026 19:47:46 -0400 (0:00:00.120) 0:09:10.544 *******
skipping:
[managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:47:46 -0400 (0:00:00.284) 0:09:10.829 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:47:46 -0400 (0:00:00.156) 0:09:10.985 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:47:49 -0400 (0:00:03.007) 0:09:13.993 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", 
"source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:47:53 -0400 (0:00:03.126) 0:09:17.119 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:47:53 -0400 (0:00:00.360) 0:09:17.479 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-245394af-792c-404d-9215-db78c843f58b", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:48:06 -0400 (0:00:12.891) 0:09:30.371 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:48:06 -0400 (0:00:00.301) 0:09:30.673 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877553.602368, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d73baed071123922c6a088c4cd6befdaf948cdc7", "ctime": 1773877553.6003683, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773877553.6003683, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:48:08 -0400 (0:00:01.781) 0:09:32.454 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:48:10 -0400 (0:00:01.820) 0:09:34.275 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:48:10 -0400 (0:00:00.447) 0:09:34.723 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { 
"action": "create device", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-245394af-792c-404d-9215-db78c843f58b", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:48:11 -0400 (0:00:00.498) 0:09:35.221 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:48:11 -0400 (0:00:00.405) 0:09:35.626 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:48:11 -0400 (0:00:00.268) 0:09:35.895 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-245394af-792c-404d-9215-db78c843f58b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:48:13 -0400 (0:00:02.062) 0:09:37.957 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:48:15 -0400 (0:00:01.981) 0:09:39.939 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", 
"fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:48:17 -0400 (0:00:01.857) 0:09:41.796 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:48:17 -0400 (0:00:00.278) 0:09:42.074 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:48:20 -0400 (0:00:02.206) 0:09:44.280 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877568.9744122, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a94635f18ad9719ab369c6c6976f63dd6c7b1b9a", "ctime": 1773877559.7793858, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264032, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773877559.7783859, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744071680135551", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:48:21 -0400 (0:00:01.764) 0:09:46.045 ******* changed: [managed-node4] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-245394af-792c-404d-9215-db78c843f58b', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-245394af-792c-404d-9215-db78c843f58b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node4] => (item={u'state': u'present', u'password': u'-', 
u'name': u'luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "password": "-", "state": "present" } }
MSG: line added

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Wednesday 18 March 2026 19:48:25 -0400 (0:00:03.423) 0:09:49.469 *******
ok: [managed-node4]

TASK [Verify role results - 4] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:228
Wednesday 18 March 2026 19:48:28 -0400 (0:00:03.333) 0:09:52.802 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4

TASK [Print out pool information] **********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Wednesday 18 March 2026 19:48:29 -0400 (0:00:01.134) 0:09:53.937 *******
ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }

TASK [Print out volume information] ********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Wednesday 18 March 2026 19:48:30 -0400 (0:00:00.357) 0:09:54.330 *******
skipping: [managed-node4] => {}
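The pool dictionary echoed above is the role's post-run view of its storage_pools input. As a minimal sketch only (the invoking playbook is not part of this log, and luks_password is an invented stand-in for the value redacted as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER), a request that would drive the actions recorded earlier in this run, wiping the whole-disk LUKS device and recreating it as an encrypted 4g partition volume mounted at /opt/test1, could look like:

    - name: Recreate sda as an encrypted partition volume (illustrative sketch)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_password: "{{ luks_password }}"  # invented variable; shown redacted in the log

Everything else in the output (the luks-<UUID> mapper name and the /etc/fstab and /etc/crypttab edits) is derived by the role from a specification of this shape.

TASK [Collect info about the volumes.]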
***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:48:30 -0400 (0:00:00.357) 0:09:54.688 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "size": "4G", "type": "crypt", "uuid": "eba03f3c-e05e-405c-afcc-84b92fdb1f58" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "899c6988-aa8f-4f93-9423-a0a1748937ea" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:48:32 -0400 (0:00:01.839) 0:09:56.527 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002965", "end": "2026-03-18 19:48:34.061381", "rc": 0, "start": "2026-03-18 19:48:34.058416" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Wednesday 18 March 2026 19:48:34 -0400 (0:00:01.942) 0:09:58.469 *******
ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002514", "end": "2026-03-18 19:48:35.932723", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:48:35.930209" }
STDOUT:
luks-899c6988-aa8f-4f93-9423-a0a1748937ea /dev/sda1 -

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 18 March 2026 19:48:36 -0400 (0:00:01.942) 0:10:00.412 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Wednesday 18 March 2026 19:48:37 -0400 (0:00:00.796) 0:10:01.208 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Wednesday 18 March 2026 19:48:37 -0400 (0:00:00.352) 0:10:01.560 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Wednesday 18 March 2026 19:48:37 -0400 (0:00:00.330) 0:10:01.891 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Wednesday 18 March 2026 19:48:38 -0400 (0:00:00.457) 0:10:02.349 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4

TASK [Set test variables] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Wednesday 18 March 2026 19:48:39 -0400 (0:00:01.015) 0:10:03.365 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
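The single /etc/crypttab line read back above uses the crypttab(5) field order of mapped name, backing device, and key file, where "-" means no key file is stored, so the passphrase has to be supplied by other means. A minimal sketch of the kind of check this verify stage performs, assuming an invented register name crypttab (not taken from the test source):

    - name: Read /etc/crypttab (illustrative sketch)
      command: cat /etc/crypttab
      register: crypttab  # invented register name
      changed_when: false

    - name: Expect exactly one entry for the new LUKS device
      assert:
        that:
          - crypttab.stdout_lines | select('search', 'luks-899c6988-aa8f-4f93-9423-a0a1748937ea') | list | length == 1

TASK [Get the canonical device path for each member device] ******************** task path: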
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Wednesday 18 March 2026 19:48:39 -0400 (0:00:00.388) 0:10:03.753 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Wednesday 18 March 2026 19:48:39 -0400 (0:00:00.298) 0:10:04.051 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Wednesday 18 March 2026 19:48:40 -0400 (0:00:00.346) 0:10:04.398 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Wednesday 18 March 2026 19:48:40 -0400 (0:00:00.405) 0:10:04.804 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Wednesday 18 March 2026 19:48:41 -0400 (0:00:00.328) 0:10:05.132 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type - 2] ************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Wednesday 18 March 2026 19:48:41 -0400 (0:00:00.371) 0:10:05.504 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected pv type - 3] ************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Wednesday 18 March 2026 19:48:41 -0400 (0:00:00.345) 0:10:05.849 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55
Wednesday 18 March 2026 19:48:42 -0400 (0:00:00.340) 0:10:06.190 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
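All of the PV and VG member checks above skip because this pool was declared with type "partition", so none of the LVM assertions apply. A hypothetical variation of the same specification, not part of this test run, would exercise them simply by switching the pool type:

    storage_pools:
      - name: foo
        type: lvm      # this run used 'partition', which is why the checks skipped
        disks:
          - sda
        volumes: []    # volume details elided; see the sketch after the pool echo above

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68
Wednesday 18 March 2026 19:48:42 -0400 (0:00:00.404) 0:10:06.595 *******
ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 1 }
STDERR:
Shared connection to 10.31.44.104 closed.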
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 19:48:44 -0400 (0:00:01.765) 0:10:08.361 ******* TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 19:48:44 -0400 (0:00:00.241) 0:10:08.603 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 19:48:45 -0400 (0:00:00.794) 0:10:09.398 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 19:48:45 -0400 (0:00:00.296) 0:10:09.694 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 19:48:45 -0400 (0:00:00.395) 0:10:10.089 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 19:48:46 -0400 (0:00:00.328) 0:10:10.417 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 19:48:46 -0400 (0:00:00.318) 0:10:10.736 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 19:48:46 -0400 (0:00:00.217) 0:10:10.954 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 19:48:47 -0400 (0:00:00.354) 0:10:11.309 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 19:48:47 -0400 (0:00:00.387) 0:10:11.696 ******* skipping: 
[managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 19:48:47 -0400 (0:00:00.339) 0:10:12.035 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 19:48:48 -0400 (0:00:00.424) 0:10:12.460 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 19:48:48 -0400 (0:00:00.446) 0:10:12.907 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 19:48:49 -0400 (0:00:00.357) 0:10:13.264 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 19:48:49 -0400 (0:00:00.826) 0:10:14.091 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_kernel_device": 
"/dev/dm-0", "_mount_id": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 19:48:50 -0400 (0:00:00.522) 0:10:14.613 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 19:48:51 -0400 (0:00:00.796) 0:10:15.410 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 19:48:51 -0400 (0:00:00.636) 0:10:16.046 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 19:48:52 -0400 (0:00:00.934) 0:10:16.981 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 19:48:53 -0400 (0:00:00.581) 0:10:17.562 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 19:48:53 -0400 (0:00:00.358) 0:10:17.920 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 19:48:54 -0400 (0:00:00.316) 0:10:18.236 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 19:48:54 -0400 (0:00:00.308) 0:10:18.545 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 19:48:55 -0400 (0:00:00.900) 0:10:19.446 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 19:48:55 -0400 (0:00:00.403) 0:10:19.850 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 19:48:56 -0400 (0:00:00.912) 0:10:20.762 ******* skipping: [managed-node4] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 19:48:57 -0400 (0:00:00.378) 0:10:21.141 ******* skipping: [managed-node4] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 19:48:58 -0400 (0:00:01.141) 0:10:22.283 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 19:48:58 -0400 (0:00:00.316) 0:10:22.600 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 19:48:58 -0400 (0:00:00.422) 0:10:23.022 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 19:48:59 -0400 (0:00:00.366) 0:10:23.388 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 19:48:59 -0400 (0:00:00.293) 0:10:23.682 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 19:48:59 -0400 (0:00:00.360) 0:10:24.043 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 19:49:00 -0400 (0:00:00.395) 0:10:24.438 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:49:01 -0400 (0:00:00.712) 0:10:25.150 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", 
"encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:49:01 -0400 (0:00:00.344) 0:10:25.494 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 19:49:03 -0400 (0:00:02.380) 0:10:27.875 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 19:49:04 -0400 (0:00:00.308) 0:10:28.183 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 19:49:04 -0400 (0:00:00.580) 0:10:28.764 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 19:49:04 -0400 (0:00:00.269) 0:10:29.034 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 19:49:05 -0400 (0:00:00.446) 0:10:29.480 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Wednesday 18 March 2026 19:49:05 -0400 (0:00:00.506) 0:10:29.986 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Wednesday 18 March 2026 19:49:06 -0400 (0:00:00.307) 0:10:30.294 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Wednesday 18 March 2026 19:49:06 -0400 (0:00:00.376) 0:10:30.671 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Wednesday 18 March 2026 19:49:07 -0400 (0:00:00.439) 0:10:31.110 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Wednesday 18 March 2026 19:49:07 -0400 (0:00:00.364) 0:10:31.474 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Wednesday 18 March 2026 19:49:07 -0400 (0:00:00.368) 0:10:31.843 *******
ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Wednesday 18 March 2026 19:49:08 -0400 (0:00:00.646) 0:10:32.252 *******
ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Wednesday 18 March 2026 19:49:08 -0400 (0:00:00.471) 0:10:32.898 *******
ok: [managed-node4] => { "changed": false }
MSG: All assertions passed
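The expected-match facts above record how the role wrote this mount: for a LUKS volume the fstab source is the /dev/mapper path rather than a filesystem UUID, so the identifier check counts occurrences of the mapper name in /etc/fstab. A minimal standalone equivalent, assuming an invented register name fstab_b64 (the real test matches against the facts it set above):

    - name: Read /etc/fstab from the managed node (illustrative sketch)
      slurp:
        src: /etc/fstab
      register: fstab_b64  # invented register name

    - name: The LUKS mapper device should appear exactly once
      assert:
        that:
          - (fstab_b64.content | b64decode).splitlines() | select('search', 'luks-899c6988-aa8f-4f93-9423-a0a1748937ea') | list | length == 1

TASK [Verify the fstab mount point] ******************************************** task path: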
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 19:49:09 -0400 (0:00:00.471) 0:10:33.370 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 19:49:09 -0400 (0:00:00.470) 0:10:33.840 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 19:49:10 -0400 (0:00:00.350) 0:10:34.191 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 19:49:10 -0400 (0:00:00.365) 0:10:34.556 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 19:49:10 -0400 (0:00:00.244) 0:10:34.801 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 19:49:11 -0400 (0:00:00.397) 0:10:35.199 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 19:49:11 -0400 (0:00:00.498) 0:10:35.697 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877685.6647463, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877685.6647463, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 134797, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773877685.6647463, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 19:49:13 -0400 (0:00:01.918) 0:10:37.616 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 19:49:13 -0400 (0:00:00.337) 0:10:37.953 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 19:49:14 -0400 (0:00:00.311) 0:10:38.265 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 19:49:14 -0400 (0:00:00.478) 0:10:38.744 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 19:49:15 -0400 (0:00:00.384) 0:10:39.129 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 19:49:15 -0400 (0:00:00.425) 0:10:39.554 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 19:49:15 -0400 (0:00:00.521) 0:10:40.076 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877685.7737465, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877685.7737465, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 134847, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773877685.7737465, "nlink": 1, "path": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 19:49:18 -0400 (0:00:02.049) 0:10:42.125 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ 
"cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 19:49:21 -0400 (0:00:03.230) 0:10:45.355 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.024565", "end": "2026-03-18 19:49:22.968111", "rc": 0, "start": "2026-03-18 19:49:22.943546" } STDOUT: LUKS header information for /dev/sda1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: d0 18 71 9a 46 70 f0 9b 0d 3f 76 54 8b ce bd 22 c4 f8 63 4e MK salt: 64 d8 1d 00 ee 49 4c f4 54 f1 c0 c7 87 30 b1 b4 4a 81 0e 3d 30 14 d6 49 13 b6 61 23 4f 80 85 85 MK iterations: 23141 UUID: 899c6988-aa8f-4f93-9423-a0a1748937ea Key Slot 0: ENABLED Iterations: 371308 Salt: d8 f8 ce 0c 85 93 ea cf 7d a3 b6 ad 18 5a 86 e4 2f a3 c6 73 f0 ff ac 37 f1 a7 90 c3 c3 3e 44 19 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 19:49:23 -0400 (0:00:02.084) 0:10:47.440 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 19:49:23 -0400 (0:00:00.429) 0:10:47.870 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 19:49:24 -0400 (0:00:00.422) 0:10:48.292 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 19:49:24 -0400 (0:00:00.412) 0:10:48.705 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 19:49:25 -0400 (0:00:00.423) 0:10:49.128 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 19:49:25 -0400 (0:00:00.326) 0:10:49.454 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Wednesday 18 March 2026 19:49:25 -0400 (0:00:00.337) 0:10:49.792 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 18 March 2026 19:49:26 -0400 (0:00:00.327) 0:10:50.119 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-899c6988-aa8f-4f93-9423-a0a1748937ea /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Wednesday 18 March 2026 19:49:26 -0400 (0:00:00.451) 0:10:50.571 *******
ok: [managed-node4] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Wednesday 18 March 2026 19:49:26 -0400 (0:00:00.374) 0:10:50.946 *******
ok: [managed-node4] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Wednesday 18 March 2026 19:49:27 -0400 (0:00:00.466) 0:10:51.412 *******
ok: [managed-node4] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Wednesday 18 March 2026 19:49:27 -0400 (0:00:00.357) 0:10:51.769 *******
ok: [managed-node4] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Wednesday 18 March 2026 19:49:28 -0400 (0:00:00.385) 0:10:52.154 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
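Each /etc/crypttab line the role writes has three whitespace-separated fields, and the checks above validate them one by one: the mapped name (luks-<UUID>), the backing device (/dev/sda1), and the key file, where "-" means no key file is stored and the passphrase must be supplied at unlock time. A sketch of equivalent assertions, with the crypttab_line value copied from the Set test variables output above (the variable name is illustrative; the real test derives the entry list by filtering /etc/crypttab):

    - name: Validate the format of a crypttab entry
      vars:
        crypttab_line: "luks-899c6988-aa8f-4f93-9423-a0a1748937ea /dev/sda1 -"
      assert:
        that:
          - crypttab_line.split() | length == 3
          - crypttab_line.split()[1] == '/dev/sda1'
          - crypttab_line.split()[2] == '-'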
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Wednesday 18 March 2026 19:49:28 -0400 (0:00:00.286) 0:10:52.441 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Wednesday 18 March 2026 19:49:28 -0400 (0:00:00.384) 0:10:52.825 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Wednesday 18 March 2026 19:49:29 -0400 (0:00:00.309) 0:10:53.135 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Wednesday 18 March 2026 19:49:29 -0400 (0:00:00.373) 0:10:53.508 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Wednesday 18 March 2026 19:49:29 -0400 (0:00:00.362) 0:10:53.870 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Wednesday 18 March 2026 19:49:30 -0400 (0:00:00.291) 0:10:54.162 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Wednesday 18 March 2026 19:49:30 -0400 (0:00:00.298) 0:10:54.460 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Wednesday 18 March 2026 19:49:30 -0400 (0:00:00.392) 0:10:54.853 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Wednesday 18 March 2026 19:49:31 -0400 (0:00:00.305) 0:10:55.158 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Wednesday 18 March 2026 19:49:31 -0400 (0:00:00.389) 0:10:55.548 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Wednesday 18 March 2026 19:49:31 -0400 (0:00:00.327) 0:10:55.876 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Wednesday 18 March 2026 19:49:32
-0400 (0:00:00.305) 0:10:56.181 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 19:49:32 -0400 (0:00:00.321) 0:10:56.504 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 19:49:32 -0400 (0:00:00.342) 0:10:56.846 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 19:49:33 -0400 (0:00:00.397) 0:10:57.243 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 19:49:33 -0400 (0:00:00.384) 0:10:57.628 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 19:49:33 -0400 (0:00:00.431) 0:10:58.060 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 19:49:34 -0400 (0:00:00.399) 0:10:58.460 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 19:49:34 -0400 (0:00:00.482) 0:10:58.942 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 19:49:35 -0400 (0:00:00.350) 0:10:59.293 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 19:49:35 -0400 (0:00:00.294) 0:10:59.587 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 
March 2026 19:49:35 -0400 (0:00:00.391) 0:10:59.979 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 19:49:36 -0400 (0:00:00.409) 0:11:00.388 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 19:49:36 -0400 (0:00:00.404) 0:11:00.792 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 19:49:37 -0400 (0:00:00.303) 0:11:01.096 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 19:49:37 -0400 (0:00:00.417) 0:11:01.513 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 19:49:37 -0400 (0:00:00.442) 0:11:01.956 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 19:49:38 -0400 (0:00:00.325) 0:11:02.281 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 19:49:38 -0400 (0:00:00.311) 0:11:02.593 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 19:49:38 -0400 (0:00:00.347) 0:11:02.941 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 19:49:39 -0400 (0:00:00.410) 0:11:03.352 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 
March 2026 19:49:39 -0400 (0:00:00.440) 0:11:03.792 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 19:49:40 -0400 (0:00:00.404) 0:11:04.197 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 19:49:40 -0400 (0:00:00.328) 0:11:04.526 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 19:49:40 -0400 (0:00:00.375) 0:11:04.902 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 19:49:41 -0400 (0:00:00.364) 0:11:05.266 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 19:49:41 -0400 (0:00:00.383) 0:11:05.650 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 19:49:41 -0400 (0:00:00.360) 0:11:06.010 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 19:49:42 -0400 (0:00:00.217) 0:11:06.228 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 19:49:42 -0400 (0:00:00.384) 0:11:06.613 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 19:49:42 -0400 (0:00:00.328) 0:11:06.941 ******* skipping: [managed-node4] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 19:49:43 -0400 (0:00:00.371) 0:11:07.313 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 19:49:43 -0400 (0:00:00.236) 0:11:07.550 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 19:49:43 -0400 (0:00:00.264) 0:11:07.814 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 19:49:44 -0400 (0:00:00.365) 0:11:08.180 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 19:49:44 -0400 (0:00:00.307) 0:11:08.487 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 19:49:44 -0400 (0:00:00.290) 0:11:08.777 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 19:49:44 -0400 (0:00:00.273) 0:11:09.051 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:234 Wednesday 18 March 2026 19:49:47 -0400 (0:00:02.238) 0:11:11.289 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 19:49:48 -0400 (0:00:01.068) 0:11:12.358 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] 
}, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 19:49:48 -0400 (0:00:00.442) 0:11:12.801 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:49:49 -0400 (0:00:00.400) 0:11:13.201 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:49:49 -0400 (0:00:00.172) 0:11:13.374 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:49:49 -0400 (0:00:00.468) 0:11:13.843 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:49:50 -0400 (0:00:00.297) 0:11:14.140 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:49:53 -0400 (0:00:02.960) 0:11:17.101 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:49:53 -0400 (0:00:00.722) 0:11:17.824 ******* skipping: [managed-node4] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:49:54 -0400 (0:00:00.399) 0:11:18.223 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:49:54 -0400 (0:00:00.294) 0:11:18.518 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:49:54 -0400 (0:00:00.262) 0:11:18.780 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:49:55 -0400 (0:00:00.319) 0:11:19.100 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:49:55 -0400 (0:00:00.955) 0:11:20.055 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:49:59 -0400 (0:00:03.817) 0:11:23.873 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:50:00 -0400 (0:00:00.450) 0:11:24.323 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK 
[fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:50:00 -0400 (0:00:00.427) 0:11:24.751 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:50:06 -0400 (0:00:05.998) 0:11:30.749 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:50:07 -0400 (0:00:00.500) 0:11:31.250 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:50:07 -0400 (0:00:00.307) 0:11:31.558 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:50:07 -0400 (0:00:00.470) 0:11:32.028 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:50:08 -0400 (0:00:00.337) 0:11:32.365 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:50:11 -0400 (0:00:03.602) 0:11:35.968 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": 
"blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": 
"plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service": { "name": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": 
"systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:50:14 -0400 (0:00:02.875) 0:11:38.844 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d245394af\x2d792c\x2d404d\x2d9215\x2ddb78c843f58b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "name": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-readahead-replay.service systemd-journald.socket systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice dev-sda.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-245394af-792c-404d-9215-db78c843f58b", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", 
"ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-245394af-792c-404d-9215-db78c843f58b /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-245394af-792c-404d-9215-db78c843f58b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", 
"WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:50:17 -0400 (0:00:02.572) 0:11:41.416 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-899c6988-aa8f-4f93-9423-a0a1748937ea' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 19:50:23 -0400 (0:00:06.086) 0:11:47.502 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'deduplication': None, u'vdo_pool_size': None, u'encryption_key_size': 0, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': 
None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-899c6988-aa8f-4f93-9423-a0a1748937ea' in safe mode due to encryption removal"}
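[Editor's note: the failure above is the expected outcome for this step. The module args show u'safe_mode': True, and in safe mode blivet refuses any action that would destroy existing formatting, here the LUKS layer that the requested encryption: false would strip off. A minimal sketch of how a playbook deliberately opts out of this protection, assuming the role's storage_safe_mode variable; the pool variable is hypothetical, the actual spec appears at the Show storage_pools task further down:

    - name: Remove the encryption layer (destructive, so opt out of safe mode)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # storage_safe_mode defaults to true; while true, blivet refuses
        # "destroy format" actions on devices that already hold data
        storage_safe_mode: false
        storage_pools: "{{ pools_with_encryption_disabled }}"  # hypothetical variable

End of note.]
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:50:23 -0400 (0:00:00.473) 0:11:47.975 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d245394af\x2d792c\x2d404d\x2d9215\x2ddb78c843f58b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "name": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0",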
"MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d245394af\\x2d792c\\x2d404d\\x2d9215\\x2ddb78c843f58b.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 19:50:25 -0400 (0:00:02.058) 0:11:50.034 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 19:50:26 -0400 (0:00:00.394) 0:11:50.428 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 19:50:26 -0400 (0:00:00.561) 0:11:50.990 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 19:50:27 -0400 (0:00:00.372) 0:11:51.362 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877786.7560334, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773877786.7560334, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773877786.7560334, "nlink": 1, "path": 
"/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1615306365", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 19:50:29 -0400 (0:00:01.810) 0:11:53.173 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:258 Wednesday 18 March 2026 19:50:29 -0400 (0:00:00.450) 0:11:53.623 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:50:30 -0400 (0:00:01.137) 0:11:54.761 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:50:30 -0400 (0:00:00.294) 0:11:55.055 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:50:31 -0400 (0:00:00.402) 0:11:55.457 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:50:31 -0400 (0:00:00.326) 0:11:55.784 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:50:34 -0400 (0:00:03.055) 0:11:58.840 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:50:35 -0400 (0:00:00.670) 0:11:59.510 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:50:35 -0400 (0:00:00.356) 0:11:59.867 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:50:36 -0400 (0:00:00.320) 0:12:00.188 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:50:36 -0400 (0:00:00.341) 0:12:00.529 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:50:36 -0400 (0:00:00.343) 0:12:00.872 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:50:37 -0400 (0:00:00.711) 0:12:01.584 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 
19:50:41 -0400 (0:00:03.663) 0:12:05.248 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
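[Editor's note: this debug output is the pool specification the test feeds the role for the removal re-run. Note that debug is not no_log-aware, so the test password appears verbatim here even though the module itself reports it as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER. Expressed as playbook variables, the same input would look like this sketch (the safe-mode line is an assumption, inferred from the fact that this run's destructive actions succeed below):

    storage_safe_mode: false      # assumed for this run; the earlier failed call had safe_mode: True
    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: false                   # volume is currently LUKS; false requests removal
            encryption_password: yabbadabbadoo  # test value, visible in the debug output above

End of note.]
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:50:41 -0400 (0:00:00.374) 0:12:05.622 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:50:41 -0400 (0:00:00.263) 0:12:05.885 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:50:47 -0400 (0:00:05.445) 0:12:11.331 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:50:47 -0400 (0:00:00.424) 0:12:11.755 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:50:47 -0400 (0:00:00.255) 0:12:12.010 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:50:48 -0400 (0:00:00.213) 0:12:12.223 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:50:48 -0400 (0:00:00.203) 0:12:12.427 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:50:51 -0400 (0:00:03.143) 0:12:15.570 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state":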
"running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": 
"debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": 
"lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { 
"name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { 
"name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service": { "name": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { 
"name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:50:54 -0400 (0:00:02.908) 0:12:18.479 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d899c6988\x2daa8f\x2d4f93\x2d9423\x2da0a1748937ea.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "name": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-sda1.device systemd-journald.socket systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-899c6988-aa8f-4f93-9423-a0a1748937ea /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-899c6988-aa8f-4f93-9423-a0a1748937ea ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", 
"StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:50:57 -0400 (0:00:02.619) 0:12:21.099 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=8147f929-8743-457e-91a7-9bb117717796", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:51:03 -0400 (0:00:06.468) 0:12:27.567 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:51:03 -0400 (0:00:00.403) 0:12:27.971 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877697.4097798, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "23e5bee53c5ec42279be88e1e27b02b3cae98203", "ctime": 1773877697.4067798, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773877697.4067798, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:51:05 -0400 (0:00:01.901) 0:12:29.872 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:51:07 -0400 (0:00:02.089) 0:12:31.962 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d899c6988\x2daa8f\x2d4f93\x2d9423\x2da0a1748937ea.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "name": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": 
"0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "dev-mapper-luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:51:10 -0400 (0:00:02.211) 0:12:34.174 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": "xfs" }, { 
"action": "destroy device", "device": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=8147f929-8743-457e-91a7-9bb117717796", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:51:10 -0400 (0:00:00.458) 0:12:34.632 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", 
"_kernel_device": "/dev/sda1", "_mount_id": "UUID=8147f929-8743-457e-91a7-9bb117717796", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:51:10 -0400 (0:00:00.417) 0:12:35.049 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:51:11 -0400 (0:00:00.299) 0:12:35.349 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-899c6988-aa8f-4f93-9423-a0a1748937ea" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:51:13 -0400 (0:00:02.130) 0:12:37.480 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:51:15 -0400 (0:00:02.167) 0:12:39.648 ******* changed: [managed-node4] => (item={u'src': u'UUID=8147f929-8743-457e-91a7-9bb117717796', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796", "state": "mounted" }, "name": 
"/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:51:17 -0400 (0:00:01.978) 0:12:41.626 ******* skipping: [managed-node4] => (item={u'src': u'UUID=8147f929-8743-457e-91a7-9bb117717796', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:51:17 -0400 (0:00:00.321) 0:12:41.948 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:51:20 -0400 (0:00:02.258) 0:12:44.207 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877715.9308326, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1aaf77c8d6ca23be4b13d3c5b00af4954ff48d7", "ctime": 1773877705.0638018, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264032, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773877705.0638018, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "18446744071680135716", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:51:21 -0400 (0:00:01.523) 0:12:45.731 ******* changed: [managed-node4] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-899c6988-aa8f-4f93-9423-a0a1748937ea', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:51:23 -0400 (0:00:01.870) 0:12:47.601 ******* ok: [managed-node4] TASK [Verify role results - 5] ************************************************* task path: 
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:51:23 -0400 (0:00:01.870) 0:12:47.601 ******* ok: [managed-node4] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:274 Wednesday 18 March 2026 19:51:26 -0400 (0:00:02.759) 0:12:50.360 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 19:51:27 -0400 (0:00:01.056) 0:12:51.417 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=8147f929-8743-457e-91a7-9bb117717796", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 19:51:27 -0400 (0:00:00.574) 0:12:51.991 ******* skipping: [managed-node4] => {} TASK [Collect info about the volumes.]
***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:51:28 -0400 (0:00:00.301) 0:12:52.292 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "8147f929-8743-457e-91a7-9bb117717796" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:51:30 -0400 (0:00:01.877) 0:12:54.170 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002705", "end": "2026-03-18 19:51:31.596734", "rc": 0, "start": "2026-03-18 19:51:31.594029" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=8147f929-8743-457e-91a7-9bb117717796 /opt/test1 xfs defaults 0 0
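
Note: /etc/fstab now carries the role's fingerprint comment (# system_role:storage) and the new UUID-based entry for /opt/test1, and the old /dev/mapper/luks-* line is gone. A quick manual spot-check equivalent to what the verify tasks below assert could look like this (hypothetical helper task, not part of the test suite):

    - name: Spot-check the fstab entry for the test volume (hypothetical)
      command: grep -c 'UUID=8147f929-8743-457e-91a7-9bb117717796 /opt/test1' /etc/fstab
      register: fstab_match
      changed_when: false
      # grep -c prints the match count; exactly one entry is expected.
      failed_when: fstab_match.stdout != '1'
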
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 19:51:31 -0400 (0:00:01.869) 0:12:56.040 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002502", "end": "2026-03-18 19:51:33.150878", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:51:33.148376" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:51:33 -0400 (0:00:01.560) 0:12:57.601 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 19:51:34 -0400 (0:00:00.543) 0:12:58.145 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 19:51:34 -0400 (0:00:00.192) 0:12:58.337 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 19:51:34 -0400 (0:00:00.342) 0:12:58.679 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 19:51:34 -0400 (0:00:00.248) 0:12:58.928 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 19:51:35 -0400 (0:00:00.703) 0:12:59.632 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 19:51:35 -0400 (0:00:00.328) 0:12:59.961 ******* TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 19:51:36 -0400 (0:00:00.303) 0:13:00.264
******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 19:51:36 -0400 (0:00:00.396) 0:13:00.661 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 19:51:36 -0400 (0:00:00.346) 0:13:01.007 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 19:51:37 -0400 (0:00:00.324) 0:13:01.332 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 19:51:37 -0400 (0:00:00.382) 0:13:01.714 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 19:51:37 -0400 (0:00:00.347) 0:13:02.062 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 19:51:38 -0400 (0:00:00.318) 0:13:02.380 ******* TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 19:51:38 -0400 (0:00:00.349) 0:13:02.730 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.44.104 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 19:51:40 -0400 (0:00:02.128) 0:13:04.858 ******* TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 19:51:41 -0400 (0:00:00.329) 0:13:05.188 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 19:51:41 -0400 (0:00:00.812) 0:13:06.001 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 19:51:42 -0400 (0:00:00.391) 0:13:06.392 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 19:51:42 -0400 (0:00:00.382) 0:13:06.774 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 19:51:43 -0400 (0:00:00.397) 0:13:07.172 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 19:51:43 -0400 (0:00:00.375) 0:13:07.547 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 19:51:43 -0400 (0:00:00.444) 0:13:07.992 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 19:51:44 -0400 (0:00:00.308) 0:13:08.300 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 19:51:44 -0400 (0:00:00.339) 0:13:08.640 ******* skipping: 
[managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 19:51:44 -0400 (0:00:00.409) 0:13:09.050 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 19:51:45 -0400 (0:00:00.278) 0:13:09.329 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 19:51:45 -0400 (0:00:00.348) 0:13:09.678 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 19:51:45 -0400 (0:00:00.302) 0:13:09.980 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 19:51:46 -0400 (0:00:00.743) 0:13:10.724 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': 0, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=8147f929-8743-457e-91a7-9bb117717796', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=8147f929-8743-457e-91a7-9bb117717796", "_raw_device": "/dev/sda1", 
"_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 19:51:47 -0400 (0:00:00.498) 0:13:11.222 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 19:51:48 -0400 (0:00:00.878) 0:13:12.101 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': 0, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=8147f929-8743-457e-91a7-9bb117717796', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=8147f929-8743-457e-91a7-9bb117717796", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 19:51:48 -0400 (0:00:00.472) 0:13:12.573 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 19:51:49 -0400 (0:00:00.937) 0:13:13.510 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 19:51:49 -0400 (0:00:00.527) 0:13:14.038 ******* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 19:51:50 -0400 (0:00:00.152) 0:13:14.191 ******* TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 19:51:50 -0400 (0:00:00.262) 0:13:14.454 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 19:51:50 -0400 (0:00:00.288) 0:13:14.742 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 19:51:51 -0400 (0:00:00.745) 0:13:15.488 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, 
u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': 0, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=8147f929-8743-457e-91a7-9bb117717796', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=8147f929-8743-457e-91a7-9bb117717796", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 19:51:51 -0400 (0:00:00.481) 0:13:15.970 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 19:51:53 -0400 (0:00:01.771) 0:13:17.742 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 19:51:54 -0400 (0:00:00.451) 0:13:18.193 ******* skipping: [managed-node4] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 
March 2026 19:51:54 -0400 (0:00:00.354) 0:13:18.548 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 19:51:54 -0400 (0:00:00.249) 0:13:18.797 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 19:51:55 -0400 (0:00:00.322) 0:13:19.120 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 19:51:55 -0400 (0:00:00.336) 0:13:19.456 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 19:51:55 -0400 (0:00:00.322) 0:13:19.778 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 19:51:55 -0400 (0:00:00.287) 0:13:20.066 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 19:51:56 -0400 (0:00:00.320) 0:13:20.387 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:51:57 -0400 (0:00:00.716) 0:13:21.103 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
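
Note: the _storage_volume_tests fact set just above drives the per-volume verification: one include per subset (mount, fstab, fs, device, encryption, md, size, cache), which matches the fan-out of "included:" files recorded next. Approximately, as a sketch rather than the harness's literal task:

    - name: Run test verify for storage_test_volume_subset (approximate sketch)
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset
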
TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:51:57 -0400 (0:00:00.496) 0:13:21.599 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 19:51:59 -0400 (0:00:01.824) 0:13:23.424 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 19:51:59 -0400 (0:00:00.250) 0:13:23.674 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 19:52:00 -0400 (0:00:00.544) 0:13:24.219 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 19:52:00 -0400 (0:00:00.425) 0:13:24.644 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 19:52:00 -0400 (0:00:00.411) 0:13:25.056 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 19:52:01 -0400 (0:00:00.504) 0:13:25.560 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 19:52:01 -0400 (0:00:00.468) 0:13:26.029 ******* skipping:
[managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 19:52:02 -0400 (0:00:00.433) 0:13:26.462 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 19:52:02 -0400 (0:00:00.320) 0:13:26.782 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 19:52:03 -0400 (0:00:00.343) 0:13:27.126 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 19:52:03 -0400 (0:00:00.356) 0:13:27.482 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 19:52:03 -0400 (0:00:00.287) 0:13:27.770 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=8147f929-8743-457e-91a7-9bb117717796 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 19:52:04 -0400 (0:00:00.635) 0:13:28.406 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 19:52:04 -0400 (0:00:00.426) 0:13:28.832 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 19:52:05 -0400 (0:00:00.495) 0:13:29.328 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 19:52:05 -0400 (0:00:00.327) 0:13:29.656 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 19:52:05 -0400 (0:00:00.392) 0:13:30.049 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 19:52:06 -0400 (0:00:00.324) 0:13:30.373 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 19:52:06 -0400 (0:00:00.325) 0:13:30.699 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 19:52:07 -0400 (0:00:00.442) 0:13:31.142 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877863.10025, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773877863.10025, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 146094, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773877863.10025, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 19:52:09 -0400 (0:00:02.123) 0:13:33.265 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 19:52:09 -0400 (0:00:00.499) 0:13:33.765 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] 
********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 19:52:10 -0400 (0:00:00.412) 0:13:34.178 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 19:52:10 -0400 (0:00:00.433) 0:13:34.611 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 19:52:11 -0400 (0:00:00.514) 0:13:35.126 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 19:52:11 -0400 (0:00:00.340) 0:13:35.467 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 19:52:11 -0400 (0:00:00.462) 0:13:35.930 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 19:52:12 -0400 (0:00:00.399) 0:13:36.329 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 19:52:16 -0400 (0:00:04.043) 0:13:40.373 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 19:52:16 -0400 (0:00:00.350) 0:13:40.724 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 19:52:16 -0400 (0:00:00.291) 0:13:41.015 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 
2026 19:52:17 -0400 (0:00:00.530) 0:13:41.545 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 19:52:17 -0400 (0:00:00.224) 0:13:41.770 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 19:52:18 -0400 (0:00:00.356) 0:13:42.127 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 19:52:18 -0400 (0:00:00.361) 0:13:42.488 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 19:52:18 -0400 (0:00:00.357) 0:13:42.846 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 19:52:19 -0400 (0:00:00.357) 0:13:43.203 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 19:52:19 -0400 (0:00:00.385) 0:13:43.589 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 19:52:19 -0400 (0:00:00.442) 0:13:44.031 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 19:52:20 -0400 (0:00:00.356) 0:13:44.388 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 19:52:20 -0400 (0:00:00.288) 0:13:44.677 ******* 
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 19:52:20 -0400 (0:00:00.368) 0:13:45.046 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 19:52:21 -0400 (0:00:00.413) 0:13:45.459 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 19:52:21 -0400 (0:00:00.402) 0:13:45.862 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 19:52:22 -0400 (0:00:00.411) 0:13:46.273 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 19:52:22 -0400 (0:00:00.371) 0:13:46.645 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 19:52:22 -0400 (0:00:00.269) 0:13:46.914 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 19:52:23 -0400 (0:00:00.251) 0:13:47.166 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 19:52:23 -0400 (0:00:00.387) 0:13:47.553 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 19:52:23 -0400 (0:00:00.341) 0:13:47.895 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 19:52:24 -0400 (0:00:00.236) 0:13:48.131 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 19:52:24 -0400 (0:00:00.396) 0:13:48.528 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 19:52:24 -0400 (0:00:00.354) 0:13:48.883 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 19:52:25 -0400 (0:00:00.370) 0:13:49.253 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 19:52:25 -0400 (0:00:00.361) 0:13:49.614 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 19:52:25 -0400 (0:00:00.456) 0:13:50.071 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 19:52:26 -0400 (0:00:00.282) 0:13:50.354 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 19:52:26 -0400 (0:00:00.297) 0:13:50.651 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 19:52:26 -0400 (0:00:00.303) 0:13:50.955 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 19:52:27 -0400 (0:00:00.176) 0:13:51.132 ******* 
skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 19:52:27 -0400 (0:00:00.289) 0:13:51.422 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 19:52:27 -0400 (0:00:00.228) 0:13:51.650 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 19:52:27 -0400 (0:00:00.346) 0:13:51.997 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 19:52:28 -0400 (0:00:00.329) 0:13:52.327 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 19:52:28 -0400 (0:00:00.227) 0:13:52.555 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 19:52:28 -0400 (0:00:00.267) 0:13:52.822 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 19:52:28 -0400 (0:00:00.242) 0:13:53.065 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 19:52:29 -0400 (0:00:00.417) 0:13:53.482 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 19:52:29 -0400 (0:00:00.156) 0:13:53.639 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 
March 2026 19:52:29 -0400 (0:00:00.274) 0:13:53.913 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 19:52:30 -0400 (0:00:00.221) 0:13:54.134 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 19:52:30 -0400 (0:00:00.189) 0:13:54.324 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 19:52:30 -0400 (0:00:00.178) 0:13:54.502 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 19:52:30 -0400 (0:00:00.240) 0:13:54.743 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 19:52:30 -0400 (0:00:00.304) 0:13:55.047 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 19:52:31 -0400 (0:00:00.230) 0:13:55.278 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 19:52:31 -0400 (0:00:00.193) 0:13:55.472 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 19:52:31 -0400 (0:00:00.232) 0:13:55.704 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 19:52:31 -0400 (0:00:00.293) 0:13:55.998 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] 
******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 19:52:32 -0400 (0:00:00.307) 0:13:56.305 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 19:52:32 -0400 (0:00:00.295) 0:13:56.601 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 19:52:32 -0400 (0:00:00.381) 0:13:56.982 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 19:52:33 -0400 (0:00:00.340) 0:13:57.322 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 19:52:33 -0400 (0:00:00.274) 0:13:57.597 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 19:52:33 -0400 (0:00:00.202) 0:13:57.800 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 19:52:33 -0400 (0:00:00.247) 0:13:58.048 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 19:52:34 -0400 (0:00:00.294) 0:13:58.342 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 19:52:34 -0400 (0:00:00.246) 0:13:58.589 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 19:52:34 -0400 (0:00:00.181) 0:13:58.771 ******* ok: [managed-node4] => { "ansible_facts": { 
"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 19:52:34 -0400 (0:00:00.184) 0:13:58.955 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Wednesday 18 March 2026 19:52:36 -0400 (0:00:01.959) 0:14:00.915 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 19:52:37 -0400 (0:00:00.675) 0:14:01.590 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 19:52:37 -0400 (0:00:00.388) 0:14:01.979 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:52:38 -0400 (0:00:00.344) 0:14:02.324 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:52:38 -0400 (0:00:00.233) 0:14:02.557 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:52:38 -0400 (0:00:00.287) 0:14:02.845 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:52:39 -0400 (0:00:00.437) 0:14:03.282 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:52:41 -0400 (0:00:02.574) 0:14:05.857 ******* skipping: [managed-node4] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:52:42 -0400 (0:00:00.534) 0:14:06.391 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:52:42 -0400 (0:00:00.268) 0:14:06.659 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:52:42 -0400 (0:00:00.256) 0:14:06.916 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:52:43 -0400 (0:00:00.244) 0:14:07.160 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:52:43 -0400 (0:00:00.334) 0:14:07.495 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:52:44 -0400 (0:00:00.747) 0:14:08.242 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing 
libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:52:47 -0400 (0:00:03.368) 0:14:11.610 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:52:47 -0400 (0:00:00.455) 0:14:12.066 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:52:48 -0400 (0:00:00.465) 0:14:12.531 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:52:54 -0400 (0:00:06.091) 0:14:18.622 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:52:55 -0400 (0:00:00.473) 0:14:19.096 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:52:55 -0400 (0:00:00.357) 0:14:19.454 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:52:55 -0400 (0:00:00.316) 0:14:19.770 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:52:55 -0400 (0:00:00.238) 0:14:20.009 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ 
"cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:52:59 -0400 (0:00:03.762) 0:14:23.772 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service": { "name": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" 
}, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:53:02 -0400 (0:00:03.041) 0:14:26.813 ******* changed: [managed-node4] => 
(item=systemd-cryptsetup@luks\x2d899c6988\x2daa8f\x2d4f93\x2d9423\x2da0a1748937ea.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "name": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device systemd-journald.socket systemd-readahead-collect.service systemd-readahead-replay.service cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-899c6988-aa8f-4f93-9423-a0a1748937ea", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-899c6988-aa8f-4f93-9423-a0a1748937ea /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-899c6988-aa8f-4f93-9423-a0a1748937ea ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": 
"systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:53:04 -0400 (0:00:02.129) 0:14:28.942 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 19:53:10 -0400 (0:00:05.996) 0:14:34.938 ******* fatal: [managed-node4]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'deduplication': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:53:11 -0400 (0:00:00.557) 0:14:35.495 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d899c6988\x2daa8f\x2d4f93\x2d9423\x2da0a1748937ea.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "name": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d899c6988\\x2daa8f\\x2d4f93\\x2d9423\\x2da0a1748937ea.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", 
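
Note: the mask/unmask pair that brackets the blivet call exists because systemd-cryptsetup-generator turns /etc/crypttab entries into generated systemd-cryptsetup@*.service units; if systemd acted on those units while blivet was reworking the backing devices, it could race with the role. Masking points the unit file at /dev/null, visible above as "FragmentPath": "/dev/null" and "LoadState": "masked", so the unit cannot start; unmasking reverses it once the storage change is done. A minimal sketch of the same operation with the systemd module (the unit name is the one from this log and would differ per volume; masked: false performs the unmask):

- name: Mask a generated cryptsetup unit (sketch)
  systemd:
    name: 'systemd-cryptsetup@luks\x2d899c6988\x2daa8f\x2d4f93\x2d9423\x2da0a1748937ea.service'
    masked: true
    daemon_reload: true
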
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 19:53:13 -0400 (0:00:02.577) 0:14:38.072 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 19:53:14 -0400 (0:00:00.525) 0:14:38.598 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 19:53:15 -0400 (0:00:00.540) 0:14:39.138 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 19:53:15 -0400 (0:00:00.287) 0:14:39.425 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877956.3995144, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773877956.3995144, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773877956.3995144, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "543421540", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 19:53:17 -0400 (0:00:02.019) 0:14:41.480 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:306 Wednesday 18 March 2026 19:53:17 -0400 (0:00:00.407) 0:14:41.887 ******* ok: [managed-node4] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": 
"root", "path": "/tmp/storage_test45gdlvlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:313 Wednesday 18 March 2026 19:53:21 -0400 (0:00:04.058) 0:14:45.945 ******* ok: [managed-node4] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_test45gdlvlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1773878002.31-23656-111113771237694/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:320 Wednesday 18 March 2026 19:53:27 -0400 (0:00:05.938) 0:14:51.884 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:53:28 -0400 (0:00:00.485) 0:14:52.370 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:53:28 -0400 (0:00:00.285) 0:14:52.656 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:53:28 -0400 (0:00:00.306) 0:14:52.962 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:53:29 -0400 (0:00:00.429) 0:14:53.392 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:53:32 -0400 (0:00:02.816) 0:14:56.209 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", 
"libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:53:32 -0400 (0:00:00.811) 0:14:57.020 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:53:33 -0400 (0:00:00.259) 0:14:57.279 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:53:33 -0400 (0:00:00.338) 0:14:57.618 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:53:33 -0400 (0:00:00.280) 0:14:57.898 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:53:34 -0400 (0:00:00.350) 0:14:58.249 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:53:34 -0400 (0:00:00.635) 0:14:58.885 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] 
****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:53:38 -0400 (0:00:03.845) 0:15:02.731 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_test45gdlvlukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:53:38 -0400 (0:00:00.362) 0:15:03.094 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:53:39 -0400 (0:00:00.350) 0:15:03.445 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:53:45 -0400 (0:00:05.953) 0:15:09.398 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:53:45 -0400 (0:00:00.417) 0:15:09.828 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:53:46 -0400 (0:00:00.338) 0:15:10.166 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:53:46 -0400 (0:00:00.338) 0:15:10.504 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:53:46 -0400 (0:00:00.303) 0:15:10.808 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:53:50 -0400 (0:00:03.807) 0:15:14.615 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", 
"status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:53:53 -0400 (0:00:02.942) 0:15:17.557 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:53:53 -0400 (0:00:00.416) 0:15:17.974 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "password": "/tmp/storage_test45gdlvlukskey", "state": "present" } ], "leaves": [ "/dev/sdb", 
"/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test45gdlvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:54:05 -0400 (0:00:12.095) 0:15:30.070 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:54:06 -0400 (0:00:00.266) 0:15:30.337 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877877.24229, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "66c0f349b140a7fc214852950b4298bd6113d69a", "ctime": 1773877877.23929, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773877877.23929, 
"nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:54:07 -0400 (0:00:01.526) 0:15:31.863 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:54:09 -0400 (0:00:01.513) 0:15:33.376 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:54:09 -0400 (0:00:00.335) 0:15:33.712 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "password": "/tmp/storage_test45gdlvlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test45gdlvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:54:10 -0400 (0:00:00.441) 0:15:34.154 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test45gdlvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:54:10 -0400 (0:00:00.266) 0:15:34.420 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:54:10 -0400 (0:00:00.262) 0:15:34.683 ******* changed: [managed-node4] => (item={u'src': u'UUID=8147f929-8743-457e-91a7-9bb117717796', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", 
"fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=8147f929-8743-457e-91a7-9bb117717796" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:54:12 -0400 (0:00:01.625) 0:15:36.308 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:54:14 -0400 (0:00:02.081) 0:15:38.390 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:54:15 -0400 (0:00:01.659) 0:15:40.049 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:54:16 -0400 (0:00:00.465) 0:15:40.514 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:54:18 -0400 (0:00:01.743) 0:15:42.258 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773877893.1493351, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773877883.1383066, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264033, "isblk": false, "ischr": false, 
"isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773877883.1383066, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071680135912", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:54:20 -0400 (0:00:01.941) 0:15:44.200 ******* changed: [managed-node4] => (item={u'state': u'present', u'password': u'/tmp/storage_test45gdlvlukskey', u'name': u'luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "password": "/tmp/storage_test45gdlvlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:54:21 -0400 (0:00:01.743) 0:15:45.943 ******* ok: [managed-node4] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:336 Wednesday 18 March 2026 19:54:24 -0400 (0:00:02.504) 0:15:48.447 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 19:54:24 -0400 (0:00:00.378) 0:15:48.911 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test45gdlvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 19:54:25 -0400 (0:00:00.454) 0:15:49.365 ******* skipping: [managed-node4] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:54:25 -0400 (0:00:00.390) 0:15:49.756 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "size": "4G", "type": "crypt", "uuid": "6992ada5-d219-48ee-be71-1482be5c5d7d" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "0ebc5dd0-5035-41fd-adcc-ae42ec9792ff" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:54:27 -0400 (0:00:01.816) 0:15:51.572 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002594", "end": "2026-03-18 19:54:28.914612", "rc": 0, "start": "2026-03-18 19:54:28.912018" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), 
mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 19:54:29 -0400 (0:00:01.879) 0:15:53.451 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002637", "end": "2026-03-18 19:54:31.028583", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:54:31.025946" } STDOUT: luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff /dev/sda1 /tmp/storage_test45gdlvlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:54:31 -0400 (0:00:02.012) 0:15:55.464 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 19:54:31 -0400 (0:00:00.617) 0:15:56.081 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 19:54:32 -0400 (0:00:00.356) 0:15:56.438 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 19:54:32 -0400 (0:00:00.389) 0:15:56.828 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 19:54:33 -0400 (0:00:00.344) 0:15:57.172 ******* included: 
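Note: the same state can be spot-checked by hand; a hedged sketch of equivalent commands, with expected values taken from the device info and file contents above.

    lsblk --fs /dev/sda    # sda1 is crypto_LUKS (UUID 0ebc5dd0-...); its crypt child carries xfs mounted on /opt/test1
    findmnt /opt/test1     # SOURCE should be /dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff, FSTYPE xfs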
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 19:54:34 -0400 (0:00:01.010) 0:15:58.183 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 19:54:34 -0400 (0:00:00.284) 0:15:58.467 ******* TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 19:54:34 -0400 (0:00:00.234) 0:15:58.702 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 19:54:34 -0400 (0:00:00.273) 0:15:58.976 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 19:54:35 -0400 (0:00:00.359) 0:15:59.335 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 19:54:36 -0400 (0:00:01.572) 0:16:00.908 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 19:54:37 -0400 (0:00:00.290) 0:16:01.199 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 19:54:37 -0400 (0:00:00.279) 0:16:01.479 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 19:54:37 -0400 (0:00:00.319) 0:16:01.799 ******* TASK [Check that blivet supports PV grow to fill] ****************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 19:54:38 -0400 (0:00:00.394) 0:16:02.193 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.44.104 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 19:54:40 -0400 (0:00:02.242) 0:16:04.435 ******* TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 19:54:40 -0400 (0:00:00.387) 0:16:04.823 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 19:54:41 -0400 (0:00:00.672) 0:16:05.496 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 19:54:42 -0400 (0:00:00.627) 0:16:06.124 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 19:54:42 -0400 (0:00:00.279) 0:16:06.403 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 19:54:42 -0400 (0:00:00.360) 0:16:06.764 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 19:54:42 -0400 (0:00:00.258) 0:16:07.022 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 19:54:43 -0400 (0:00:00.347) 0:16:07.370 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 19:54:43 -0400 (0:00:00.412) 0:16:07.783 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 19:54:44 -0400 (0:00:00.433) 0:16:08.216 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 19:54:44 -0400 (0:00:00.408) 0:16:08.625 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 19:54:44 -0400 (0:00:00.388) 0:16:09.014 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 19:54:45 -0400 (0:00:00.433) 0:16:09.447 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 19:54:45 -0400 (0:00:00.334) 0:16:09.782 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 19:54:46 -0400 (0:00:00.869) 0:16:10.652 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': u'/tmp/storage_test45gdlvlukskey', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': 
None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test45gdlvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 19:54:47 -0400 (0:00:00.730) 0:16:11.383 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 19:54:48 -0400 (0:00:01.063) 0:16:12.446 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': u'/tmp/storage_test45gdlvlukskey', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { 
"_device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test45gdlvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 19:54:48 -0400 (0:00:00.492) 0:16:12.938 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 19:54:49 -0400 (0:00:00.916) 0:16:13.855 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 19:54:50 -0400 (0:00:00.458) 0:16:14.314 ******* TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 19:54:50 -0400 (0:00:00.156) 0:16:14.470 ******* TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 19:54:50 -0400 (0:00:00.274) 0:16:14.744 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 19:54:51 -0400 (0:00:00.358) 0:16:15.103 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] 
*************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 19:54:51 -0400 (0:00:00.873) 0:16:15.976 ******* skipping: [managed-node4] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_cipher': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': u'/tmp/storage_test45gdlvlukskey', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test45gdlvlukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 19:54:52 -0400 (0:00:00.445) 0:16:16.422 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 19:54:53 -0400 (0:00:01.228) 0:16:17.650 ******* 
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 19:54:53 -0400 (0:00:00.416) 0:16:18.066 ******* skipping: [managed-node4] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 19:54:54 -0400 (0:00:00.431) 0:16:18.498 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 19:54:54 -0400 (0:00:00.379) 0:16:18.877 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 19:54:55 -0400 (0:00:00.328) 0:16:19.205 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 19:54:55 -0400 (0:00:00.302) 0:16:19.508 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 19:54:55 -0400 (0:00:00.315) 0:16:19.823 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 19:54:56 -0400 (0:00:00.495) 0:16:20.319 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 19:54:56 -0400 (0:00:00.313) 0:16:20.632 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:54:57 -0400 (0:00:00.515) 0:16:21.148 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, 
"_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:54:57 -0400 (0:00:00.479) 0:16:21.628 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 19:54:59 -0400 (0:00:02.238) 0:16:23.866 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 19:55:00 -0400 (0:00:00.462) 0:16:24.328 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 19:55:00 -0400 (0:00:00.537) 0:16:24.865 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 19:55:01 -0400 (0:00:00.479) 0:16:25.345 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 19:55:01 -0400 (0:00:00.461) 0:16:25.806 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task 
path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 19:55:02 -0400 (0:00:00.473) 0:16:26.279 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 19:55:02 -0400 (0:00:00.460) 0:16:26.740 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 19:55:03 -0400 (0:00:00.390) 0:16:27.131 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 19:55:03 -0400 (0:00:00.346) 0:16:27.477 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 19:55:03 -0400 (0:00:00.339) 0:16:27.816 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 19:55:04 -0400 (0:00:00.469) 0:16:28.285 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 19:55:04 -0400 (0:00:00.290) 0:16:28.576 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 19:55:05 -0400 (0:00:00.617) 0:16:29.193 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
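Note: the three expected/actual fstab match counts above are simple substring counts over /etc/fstab; roughly equivalent manual checks, hedged:

    grep -c '/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff ' /etc/fstab   # device identifier, expect 1
    grep -c ' /opt/test1 ' /etc/fstab                                             # mount point, expect 1
    grep -c ' /opt/test1 xfs defaults ' /etc/fstab                                # mount options, expect 1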
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 19:55:05 -0400 (0:00:00.500) 0:16:29.693 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 19:55:06 -0400 (0:00:00.457) 0:16:30.151 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 19:55:06 -0400 (0:00:00.393) 0:16:30.544 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 19:55:06 -0400 (0:00:00.524) 0:16:31.069 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 19:55:07 -0400 (0:00:00.334) 0:16:31.404 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 19:55:07 -0400 (0:00:00.512) 0:16:31.916 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 19:55:08 -0400 (0:00:00.497) 0:16:32.413 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878045.5597692, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878045.5597692, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 157557, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773878045.5597692, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 19:55:10 -0400 (0:00:02.190) 0:16:34.603 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 19:55:11 -0400 (0:00:00.592) 0:16:35.196 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 19:55:11 -0400 (0:00:00.337) 0:16:35.533 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 19:55:11 -0400 (0:00:00.349) 0:16:35.882 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 19:55:12 -0400 (0:00:00.335) 0:16:36.218 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 19:55:12 -0400 (0:00:00.312) 0:16:36.531 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 19:55:12 -0400 (0:00:00.414) 0:16:36.945 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878045.6687696, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878045.6687696, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 157589, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878045.6687696, "nlink": 1, "path": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 19:55:14 -0400 (0:00:01.832) 0:16:38.778 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ 
"cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 19:55:18 -0400 (0:00:03.954) 0:16:42.733 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.025730", "end": "2026-03-18 19:55:20.052381", "rc": 0, "start": "2026-03-18 19:55:20.026651" } STDOUT: LUKS header information for /dev/sda1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 3f 77 4b 9f 62 ad 56 5d a9 d0 53 2c 07 f5 38 f9 b0 94 56 d5 MK salt: 63 d9 23 20 ba ab 0e ac d7 d4 70 1b 59 a1 bb f9 24 35 d2 c5 c1 44 c6 a2 e7 3c 79 b5 98 ff b5 35 MK iterations: 22882 UUID: 0ebc5dd0-5035-41fd-adcc-ae42ec9792ff Key Slot 0: ENABLED Iterations: 367148 Salt: dc 52 ed e3 c2 cb dd 24 52 cf e3 18 de 96 87 71 95 6d 70 fc 0c df 63 4d 4b 81 0d 7b 84 06 9c 91 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 19:55:20 -0400 (0:00:01.721) 0:16:44.455 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 19:55:20 -0400 (0:00:00.373) 0:16:44.828 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 19:55:21 -0400 (0:00:00.435) 0:16:45.264 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 19:55:21 -0400 (0:00:00.401) 0:16:45.665 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 19:55:22 -0400 (0:00:00.444) 0:16:46.110 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 19:55:22 -0400 (0:00:00.265) 0:16:46.376 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Wednesday 18 March 2026 19:55:22 -0400 (0:00:00.444) 0:16:46.110 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Wednesday 18 March 2026 19:55:22 -0400 (0:00:00.265) 0:16:46.376 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Wednesday 18 March 2026 19:55:22 -0400 (0:00:00.241) 0:16:46.618 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set test variables] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 18 March 2026 19:55:22 -0400 (0:00:00.357) 0:16:46.976 *******
ok: [managed-node4] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": [
            "luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff /dev/sda1 /tmp/storage_test45gdlvlukskey"
        ],
        "_storage_test_expected_crypttab_entries": "1",
        "_storage_test_expected_crypttab_key_file": "/tmp/storage_test45gdlvlukskey"
    },
    "changed": false
}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Wednesday 18 March 2026 19:55:23 -0400 (0:00:00.284) 0:16:47.260 *******
ok: [managed-node4] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Wednesday 18 March 2026 19:55:23 -0400 (0:00:00.456) 0:16:47.716 *******
ok: [managed-node4] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Wednesday 18 March 2026 19:55:24 -0400 (0:00:00.433) 0:16:48.149 *******
ok: [managed-node4] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Wednesday 18 March 2026 19:55:24 -0400 (0:00:00.437) 0:16:48.587 *******
ok: [managed-node4] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Wednesday 18 March 2026 19:55:24 -0400 (0:00:00.382) 0:16:48.969 *******
ok: [managed-node4] => {
    "ansible_facts": {
        "_storage_test_crypttab_entries": null,
        "_storage_test_expected_crypttab_entries": null,
        "_storage_test_expected_crypttab_key_file": null
    },
    "changed": false
}
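[Editor's note: the crypttab checks above validate the three whitespace-separated fields recorded in `_storage_test_crypttab_entries`: the mapping name (`luks-<UUID>`), the backing device, and the key file. A condensed sketch of that validation, using the values shown in the facts above; this mirrors the checks but is not the test's literal source:]

```yaml
# Sketch: validate the "<name> <device> <keyfile>" fields of a crypttab
# entry. The expected values below come from the facts logged above.
- name: Validate crypttab entry fields (sketch)
  ansible.builtin.assert:
    that:
      - entry.split() | length >= 3
      - entry.split()[0] == 'luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff'
      - entry.split()[1] == '/dev/sda1'
      - entry.split()[2] == '/tmp/storage_test45gdlvlukskey'
  vars:
    entry: "{{ _storage_test_crypttab_entries[0] }}"
```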
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Wednesday 18 March 2026 19:55:25 -0400 (0:00:00.314) 0:16:49.284 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Wednesday 18 March 2026 19:55:25 -0400 (0:00:00.318) 0:16:49.603 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Wednesday 18 March 2026 19:55:25 -0400 (0:00:00.302) 0:16:49.905 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Wednesday 18 March 2026 19:55:26 -0400 (0:00:00.289) 0:16:50.195 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Wednesday 18 March 2026 19:55:26 -0400 (0:00:00.325) 0:16:50.520 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Wednesday 18 March 2026 19:55:26 -0400 (0:00:00.257) 0:16:50.778 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Wednesday 18 March 2026 19:55:26 -0400 (0:00:00.256) 0:16:51.034 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Wednesday 18 March 2026 19:55:27 -0400 (0:00:00.238) 0:16:51.273 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Wednesday 18 March 2026 19:55:27 -0400 (0:00:00.284) 0:16:51.558 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Wednesday 18 March 2026 19:55:27 -0400 (0:00:00.249) 0:16:51.807 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Wednesday 18 March 2026 19:55:27 -0400 (0:00:00.245) 0:16:52.052 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
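[Editor's note: all of the md checks above skip because test1 is a plain LVM volume, not a RAID device. The size checks that follow compare the requested size string against the actual device size; converting a human-readable size such as the "4g" in this test's spec is a one-filter operation. A sketch, with an illustrative variable name:]

```yaml
# Sketch: turn the spec's size string into bytes for a numeric comparison.
# human_to_bytes treats "4G" as 4 GiB.
- name: Parse the requested size (sketch)
  ansible.builtin.set_fact:
    requested_bytes: "{{ '4G' | human_to_bytes }}"  # 4294967296
```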
TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Wednesday 18 March 2026 19:55:28 -0400 (0:00:00.195) 0:16:52.248 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Wednesday 18 March 2026 19:55:28 -0400 (0:00:00.279) 0:16:52.528 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show expected size] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Wednesday 18 March 2026 19:55:28 -0400 (0:00:00.166) 0:16:52.694 *******
ok: [managed-node4] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Wednesday 18 March 2026 19:55:28 -0400 (0:00:00.267) 0:16:52.962 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show test pool] **********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Wednesday 18 March 2026 19:55:29 -0400 (0:00:00.303) 0:16:53.266 *******
skipping: [managed-node4] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Wednesday 18 March 2026 19:55:29 -0400 (0:00:00.258) 0:16:53.524 *******
skipping: [managed-node4] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Wednesday 18 March 2026 19:55:29 -0400 (0:00:00.230) 0:16:53.754 *******
skipping: [managed-node4] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Wednesday 18 March 2026 19:55:29 -0400 (0:00:00.232) 0:16:53.986 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Wednesday 18 March 2026 19:55:30 -0400 (0:00:00.356) 0:16:54.343 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Wednesday 18 March 2026 19:55:30 -0400 (0:00:00.318) 0:16:54.662 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
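[Editor's note: storage_test_expected_size is only set on the code paths that compute an expected size, so the debug task above prints Ansible's "VARIABLE IS NOT DEFINED!" marker rather than failing. Printing through the default filter is the usual way to make such a probe quiet; a sketch, not the test's code:]

```yaml
# Sketch: print a possibly-undefined variable without the warning marker.
# d() is the short alias of the default filter.
- name: Show expected size (sketch)
  ansible.builtin.debug:
    msg: "{{ storage_test_expected_size | d('not computed') }}"
```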
TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Wednesday 18 March 2026 19:55:30 -0400 (0:00:00.310) 0:16:54.972 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Wednesday 18 March 2026 19:55:31 -0400 (0:00:00.241) 0:16:55.214 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Wednesday 18 March 2026 19:55:31 -0400 (0:00:00.379) 0:16:55.594 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Wednesday 18 March 2026 19:55:31 -0400 (0:00:00.251) 0:16:55.846 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Wednesday 18 March 2026 19:55:32 -0400 (0:00:00.351) 0:16:56.198 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Wednesday 18 March 2026 19:55:32 -0400 (0:00:00.378) 0:16:56.576 *******
skipping: [managed-node4] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Wednesday 18 March 2026 19:55:32 -0400 (0:00:00.359) 0:16:56.936 *******
skipping: [managed-node4] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Wednesday 18 March 2026 19:55:33 -0400 (0:00:00.352) 0:16:57.288 *******
skipping: [managed-node4] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Wednesday 18 March 2026 19:55:33 -0400 (0:00:00.298) 0:16:57.587 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Wednesday 18 March 2026 19:55:33 -0400 (0:00:00.406) 0:16:57.993 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
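[Editor's note: for thin-pool volumes, the skipped chain above bounds the pool's reserved space between a minimum and a maximum before computing usable space. Judging only from the task names, the computation has roughly the shape of a clamp; everything in this sketch, including the 20% figure, is an assumption for illustration, not the test's actual values:]

```yaml
# Sketch only: clamp a computed reservation between two bounds.
# All variable names and the percentage are assumptions.
- name: Apply size limits to reserved space (sketch)
  ansible.builtin.set_fact:
    reserved_bytes: "{{ [[(pool_bytes * 0.2) | int, min_reserve] | max, max_reserve] | min }}"
```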
TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Wednesday 18 March 2026 19:55:34 -0400 (0:00:00.344) 0:16:58.338 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Wednesday 18 March 2026 19:55:34 -0400 (0:00:00.280) 0:16:58.618 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Wednesday 18 March 2026 19:55:34 -0400 (0:00:00.269) 0:16:58.888 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Show actual size] ********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Wednesday 18 March 2026 19:55:35 -0400 (0:00:00.287) 0:16:59.175 *******
ok: [managed-node4] => {
    "storage_test_actual_size": {
        "changed": false,
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Wednesday 18 March 2026 19:55:36 -0400 (0:00:01.438) 0:17:00.613 *******
ok: [managed-node4] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Wednesday 18 March 2026 19:55:36 -0400 (0:00:00.462) 0:17:01.076 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Wednesday 18 March 2026 19:55:37 -0400 (0:00:00.361) 0:17:01.438 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Wednesday 18 March 2026 19:55:37 -0400 (0:00:00.343) 0:17:01.782 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check segment type] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Wednesday 18 March 2026 19:55:38 -0400 (0:00:00.322) 0:17:02.104 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}
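[Editor's note: the cache checks in this part of the verification query LVM for the volume's segment type (cache vs. linear) when caching is requested. Outside the role, the same information is available from lvs; a sketch, with the VG/LV names taken from this test's pool spec:]

```yaml
# Sketch: query the LV segment type the way such a check could do it.
- name: Get LV segment type (sketch)
  ansible.builtin.command: lvs --noheadings -o segtype foo/test1
  register: lv_segtype
  changed_when: false
```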
TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Wednesday 18 March 2026 19:55:38 -0400 (0:00:00.371) 0:17:02.475 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Wednesday 18 March 2026 19:55:38 -0400 (0:00:00.316) 0:17:02.791 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Wednesday 18 March 2026 19:55:39 -0400 (0:00:00.387) 0:17:03.179 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Wednesday 18 March 2026 19:55:39 -0400 (0:00:00.354) 0:17:03.533 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Wednesday 18 March 2026 19:55:39 -0400 (0:00:00.432) 0:17:03.966 *******
ok: [managed-node4] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Wednesday 18 March 2026 19:55:40 -0400 (0:00:00.338) 0:17:04.304 *******

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Wednesday 18 March 2026 19:55:40 -0400 (0:00:00.415) 0:17:04.719 *******
ok: [managed-node4] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Remove the key file] *****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:342
Wednesday 18 March 2026 19:55:40 -0400 (0:00:00.305) 0:17:05.025 *******
ok: [managed-node4] => {
    "changed": false,
    "path": "/tmp/storage_test45gdlvlukskey",
    "state": "absent"
}

TASK [Test for correct handling of new encrypted volume w/ no key - 3] *********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:352
Wednesday 18 March 2026 19:55:42 -0400 (0:00:01.811) 0:17:06.837 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4
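[Editor's note: the key file is gone, and the next run deliberately re-submits an encrypted volume with no key so the role's failure path can be verified. For contrast, a spec that satisfies the key requirement supplies either encryption_password or encryption_key on the volume; both option names appear in the module arguments logged further below. A sketch, with a placeholder password variable:]

```yaml
# Sketch: the same volume with a key supplied, so the role would proceed.
# vault_luks_password is a placeholder, not a variable from this test.
storage_pools:
  - name: foo
    disks: [sda]
    type: lvm
    volumes:
      - name: test1
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_password: "{{ vault_luks_password }}"
```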
"ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 19:55:43 -0400 (0:00:00.398) 0:17:07.759 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:55:44 -0400 (0:00:00.453) 0:17:08.212 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:55:44 -0400 (0:00:00.402) 0:17:08.615 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:55:44 -0400 (0:00:00.202) 0:17:08.817 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:55:45 -0400 (0:00:00.513) 0:17:09.331 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:55:47 -0400 (0:00:02.629) 0:17:11.961 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 18 March 2026 19:55:48 -0400 (0:00:00.779) 0:17:12.740 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 18 March 2026 19:55:49 -0400 (0:00:00.380) 0:17:13.120 *******
skipping: [managed-node4] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 18 March 2026 19:55:49 -0400 (0:00:00.354) 0:17:13.475 *******
ok: [managed-node4] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 18 March 2026 19:55:49 -0400 (0:00:00.276) 0:17:13.752 *******
ok: [managed-node4] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 18 March 2026 19:55:50 -0400 (0:00:00.366) 0:17:14.118 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 18 March 2026 19:55:50 -0400 (0:00:00.788) 0:17:14.907 *******
ok: [managed-node4] => {
    "changed": false,
    "rc": 0,
    "results": [
        "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed",
        "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed",
        "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed",
        "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed",
        "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed",
        "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed",
        "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed",
        "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed",
        "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed"
    ]
}

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 18 March 2026 19:55:54 -0400 (0:00:03.413) 0:17:18.320 *******
ok: [managed-node4] => {
    "storage_pools | d([])": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "lvm",
            "volumes": [
                {
                    "encryption": true,
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g"
                }
            ]
        }
    ]
}
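[Editor's note: the spec printed above is this run's entire input: one LVM pool on sda with a single 4 GiB encrypted volume and no key, which is what the test is about to exercise. For orientation, handing the same spec to the role directly looks roughly like this; the role name is real, the play itself is illustrative:]

```yaml
# Sketch: how a playbook hands this spec to the storage role.
- hosts: managed-node4
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            disks: [sda]
            type: lvm
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
```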
"storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:55:54 -0400 (0:00:00.451) 0:17:19.093 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:56:00 -0400 (0:00:05.889) 0:17:24.982 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:56:01 -0400 (0:00:00.504) 0:17:25.487 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:56:01 -0400 (0:00:00.409) 0:17:25.897 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:56:02 -0400 (0:00:00.504) 0:17:26.402 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:56:02 -0400 (0:00:00.442) 0:17:26.844 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:56:06 -0400 (0:00:03.439) 0:17:30.283 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", 
"source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { 
"name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:56:09 -0400 (0:00:03.328) 0:17:33.612 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:56:09 -0400 (0:00:00.362) 0:17:33.974 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 19:56:16 -0400 (0:00:06.414) 0:17:40.389 ******* fatal: [managed-node4]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': False, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_cipher': None, u'deduplication': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 
19:56:16 -0400 (0:00:00.469) 0:17:40.859 ******* TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 19:56:17 -0400 (0:00:00.515) 0:17:41.375 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 19:56:17 -0400 (0:00:00.366) 0:17:41.741 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 19:56:18 -0400 (0:00:00.610) 0:17:42.351 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:370 Wednesday 18 March 2026 19:56:18 -0400 (0:00:00.393) 0:17:42.744 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:56:19 -0400 (0:00:00.582) 0:17:43.327 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:56:19 -0400 (0:00:00.249) 0:17:43.576 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:56:19 -0400 (0:00:00.332) 0:17:43.909 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:56:20 -0400 (0:00:00.444) 0:17:44.353 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:56:23 -0400 (0:00:03.080) 0:17:47.434 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { 
"ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:56:24 -0400 (0:00:00.805) 0:17:48.239 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:56:24 -0400 (0:00:00.354) 0:17:48.594 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:56:24 -0400 (0:00:00.300) 0:17:48.894 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:56:25 -0400 (0:00:00.250) 0:17:49.145 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:56:25 -0400 (0:00:00.252) 0:17:49.397 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:56:26 -0400 (0:00:00.773) 0:17:50.171 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing 
libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:56:29 -0400 (0:00:03.758) 0:17:53.930 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:56:30 -0400 (0:00:00.551) 0:17:54.481 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:56:30 -0400 (0:00:00.419) 0:17:54.900 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:56:36 -0400 (0:00:05.976) 0:18:00.877 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:56:37 -0400 (0:00:00.603) 0:18:01.480 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:56:37 -0400 (0:00:00.269) 0:18:01.749 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:56:38 -0400 (0:00:00.384) 0:18:02.134 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:56:38 -0400 (0:00:00.357) 0:18:02.491 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get 
service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:56:41 -0400 (0:00:03.518) 0:18:06.009 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", 
"source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { 
"name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:56:45 -0400 (0:00:03.248) 0:18:09.258 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:56:45 -0400 (0:00:00.386) 0:18:09.645 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": "xfs" }, { "action": "destroy device", "device": 
"/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-64ca6b26-d808-4080-846a-f5a544a68787", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for 
udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:56:58 -0400 (0:00:12.744) 0:18:22.389 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:56:58 -0400 (0:00:00.398) 0:18:22.788 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878055.5927978, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "765263a88349cd53d6f34340cf96dca874ee4fc7", "ctime": 1773878055.590798, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773878055.590798, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:57:00 -0400 (0:00:02.135) 0:18:24.924 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:57:02 -0400 (0:00:02.078) 0:18:27.003 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:57:03 -0400 (0:00:00.428) 0:18:27.431 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "password": "-", 
"state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-64ca6b26-d808-4080-846a-f5a544a68787", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:57:03 -0400 (0:00:00.432) 0:18:27.864 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:57:04 -0400 (0:00:00.534) 0:18:28.399 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:57:04 -0400 (0:00:00.296) 0:18:28.696 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:57:06 -0400 (0:00:01.992) 0:18:30.688 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:57:08 -0400 (0:00:02.086) 0:18:32.775 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" }, "name": 
"/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:57:10 -0400 (0:00:02.051) 0:18:34.827 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:57:11 -0400 (0:00:00.505) 0:18:35.332 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:57:13 -0400 (0:00:02.520) 0:18:37.853 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878071.0268419, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ae161d8c7cc74adb1c20e0b7a78dcf6c5fdf55b8", "ctime": 1773878061.4388146, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264034, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773878061.4378145, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 83, "uid": 0, "version": "18446744071680136102", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:57:15 -0400 (0:00:02.052) 0:18:39.906 ******* changed: [managed-node4] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node4] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-64ca6b26-d808-4080-846a-f5a544a68787', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-64ca6b26-d808-4080-846a-f5a544a68787", 
"password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:57:19 -0400 (0:00:03.734) 0:18:43.641 ******* ok: [managed-node4] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:388 Wednesday 18 March 2026 19:57:22 -0400 (0:00:02.810) 0:18:46.451 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 19:57:24 -0400 (0:00:01.753) 0:18:48.205 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 19:57:24 -0400 (0:00:00.494) 0:18:48.699 ******* skipping: [managed-node4] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:57:24 -0400 (0:00:00.348) 0:18:49.048 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "64ca6b26-d808-4080-846a-f5a544a68787" }, "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "size": "4G", "type": "crypt", "uuid": "f0d18298-00c4-4619-b1a0-0f28d9fd0c9d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:57:26 -0400 (0:00:01.470) 0:18:50.518 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002541", "end": "2026-03-18 19:57:28.087676", "rc": 0, "start": "2026-03-18 19:57:28.085135" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 19:57:28 -0400 (0:00:01.953) 0:18:52.472 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002552", "end": "2026-03-18 19:57:29.823305", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:57:29.820753" } STDOUT: luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:57:30 -0400 (0:00:01.792) 0:18:54.265 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 19:57:30 -0400 (0:00:00.701) 0:18:54.966 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 19:57:31 -0400 (0:00:00.405) 0:18:55.372 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018668", "end": "2026-03-18 19:57:32.769751", "rc": 0, "start": "2026-03-18 19:57:32.751083" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 19:57:33 -0400 (0:00:01.861) 0:18:57.233 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 19:57:33 -0400 (0:00:00.490) 0:18:57.724 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 19:57:34 -0400 (0:00:00.789) 0:18:58.513 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", 
"_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 19:57:34 -0400 (0:00:00.555) 0:18:59.069 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 19:57:40 -0400 (0:00:05.072) 0:19:04.141 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 19:57:40 -0400 (0:00:00.310) 0:19:04.452 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 19:57:40 -0400 (0:00:00.375) 0:19:04.828 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 19:57:41 -0400 (0:00:00.355) 0:19:05.184 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 19:57:41 -0400 (0:00:00.305) 0:19:05.489 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 19:57:41 -0400 (0:00:00.311) 0:19:05.801 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 19:57:41 -0400 (0:00:00.263) 0:19:06.065 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 19:57:42 -0400 (0:00:00.462) 0:19:06.528 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.44.104 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 19:57:43 -0400 (0:00:01.483) 0:19:08.011 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 19:57:44 -0400 (0:00:00.341) 0:19:08.353 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 19:57:45 -0400 (0:00:00.837) 0:19:09.191 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 19:57:45 -0400 (0:00:00.313) 0:19:09.505 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 19:57:45 -0400 (0:00:00.310) 0:19:09.816 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 19:57:46 -0400 (0:00:00.342) 0:19:10.158 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 19:57:46 -0400 (0:00:00.340) 0:19:10.498 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 19:57:46 -0400 (0:00:00.320) 0:19:10.819 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 19:57:47 -0400 (0:00:00.350) 0:19:11.170 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 19:57:47 -0400 (0:00:00.407) 0:19:11.577 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 19:57:47 -0400 (0:00:00.248) 0:19:11.825 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 19:57:47 -0400 (0:00:00.193) 0:19:12.019 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 19:57:48 -0400 (0:00:00.239) 0:19:12.258 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 19:57:48 -0400 (0:00:00.286) 0:19:12.544 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 19:57:48 -0400 (0:00:00.404) 0:19:12.949 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 19:57:49 -0400 (0:00:00.534) 0:19:13.483 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 19:57:49 -0400 (0:00:00.248) 0:19:13.732 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 19:57:49 -0400 (0:00:00.267) 0:19:13.999 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 19:57:50 -0400 (0:00:00.265) 0:19:14.265 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 19:57:50 -0400 (0:00:00.317) 0:19:14.582 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 19:57:50 -0400 (0:00:00.303) 0:19:14.885 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 19:57:51 -0400 (0:00:00.331) 0:19:15.217 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 19:57:51 -0400 (0:00:00.246) 0:19:15.463 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 19:57:52 -0400 (0:00:00.968) 0:19:16.432 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 19:57:53 -0400 (0:00:00.917) 0:19:17.350 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 19:57:53 -0400 (0:00:00.338) 0:19:17.688 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 19:57:53 -0400 (0:00:00.392) 0:19:18.080 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 19:57:54 -0400 (0:00:00.385) 0:19:18.466 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 19:57:54 -0400 (0:00:00.292) 0:19:18.759 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 19:57:55 -0400 (0:00:00.798) 0:19:19.557 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 19:57:55 -0400 (0:00:00.465) 0:19:20.023 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 19:57:56 -0400 (0:00:00.485) 0:19:20.508 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 19:57:56 -0400 (0:00:00.544) 0:19:21.053 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 19:57:57 -0400 (0:00:00.543) 0:19:21.597 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 19:57:57 -0400 (0:00:00.331) 0:19:21.929 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 19:57:58 -0400 (0:00:00.217) 0:19:22.146 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 19:57:58 -0400 (0:00:00.274) 0:19:22.421 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 19:57:58 -0400 (0:00:00.334) 0:19:22.755 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 19:57:58 -0400 (0:00:00.265) 0:19:23.020 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 19:57:59 -0400 (0:00:00.349) 0:19:23.370 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 19:58:00 -0400 (0:00:00.761) 0:19:24.132 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 19:58:00 -0400 (0:00:00.737) 0:19:24.869 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 19:58:01 -0400 (0:00:00.319) 0:19:25.189 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 19:58:01 -0400 (0:00:00.378) 0:19:25.568 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 19:58:01 -0400 (0:00:00.261) 0:19:25.830 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off - 2] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 19:58:02 -0400 (0:00:00.333) 0:19:26.163 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 19:58:02 -0400 (0:00:00.332) 0:19:26.496 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 19:58:02 -0400 (0:00:00.331) 0:19:26.827 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 19:58:03 -0400 (0:00:00.318) 0:19:27.146 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 19:58:03 -0400 (0:00:00.885) 0:19:28.031 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 19:58:04 -0400 (0:00:00.422) 0:19:28.454 ******* skipping: [managed-node4] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 19:58:04 -0400 (0:00:00.223) 0:19:28.677 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 19:58:04 -0400 (0:00:00.227) 0:19:28.905 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 19:58:05 -0400 (0:00:00.266) 0:19:29.172 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 19:58:05 -0400 (0:00:00.366) 
0:19:29.538 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 19:58:05 -0400 (0:00:00.308) 0:19:29.847 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 19:58:06 -0400 (0:00:00.356) 0:19:30.203 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 19:58:06 -0400 (0:00:00.331) 0:19:30.535 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 19:58:06 -0400 (0:00:00.519) 0:19:31.054 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 19:58:07 -0400 (0:00:00.484) 0:19:31.539 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 19:58:10 -0400 (0:00:02.736) 0:19:34.276 ******* ok: 
[managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 19:58:10 -0400 (0:00:00.454) 0:19:34.730 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 19:58:10 -0400 (0:00:00.338) 0:19:35.069 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 19:58:11 -0400 (0:00:00.355) 0:19:35.425 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 19:58:11 -0400 (0:00:00.256) 0:19:35.681 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 19:58:12 -0400 (0:00:00.609) 0:19:36.291 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 19:58:12 -0400 (0:00:00.340) 0:19:36.631 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 19:58:12 -0400 (0:00:00.338) 0:19:36.970 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 19:58:13 -0400 (0:00:00.316) 0:19:37.286 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 19:58:13 -0400 (0:00:00.231) 0:19:37.517 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 19:58:13 -0400 (0:00:00.317) 0:19:37.835 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 19:58:14 -0400 (0:00:00.371) 0:19:38.207 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 19:58:14 -0400 (0:00:00.692) 0:19:38.900 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 19:58:15 -0400 (0:00:00.412) 0:19:39.312 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 19:58:15 -0400 (0:00:00.428) 0:19:39.741 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 19:58:15 -0400 (0:00:00.313) 0:19:40.054 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 19:58:16 -0400 (0:00:00.419) 0:19:40.474 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 
Wednesday 18 March 2026 19:58:16 -0400 (0:00:00.379) 0:19:40.854 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 19:58:17 -0400 (0:00:00.581) 0:19:41.436 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 19:58:17 -0400 (0:00:00.447) 0:19:41.883 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878217.614261, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878217.614261, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 168997, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878217.614261, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 19:58:19 -0400 (0:00:01.922) 0:19:43.805 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 19:58:20 -0400 (0:00:00.353) 0:19:44.159 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 19:58:20 -0400 (0:00:00.357) 0:19:44.516 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 19:58:20 -0400 (0:00:00.375) 0:19:44.892 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 19:58:21 -0400 (0:00:00.360) 0:19:45.252 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 19:58:21 -0400 (0:00:00.295) 0:19:45.548 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 19:58:21 -0400 (0:00:00.533) 0:19:46.082 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878217.7212613, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878217.7212613, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 169050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878217.7212613, "nlink": 1, "path": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 19:58:24 -0400 (0:00:02.035) 0:19:48.117 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 19:58:27 -0400 (0:00:03.432) 0:19:51.549 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.024781", "end": "2026-03-18 19:58:28.886567", "rc": 0, "start": "2026-03-18 19:58:28.861786" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: cc 6e 4c c3 62 07 69 ca 28 7b 9a 7a 70 e2 5d f8 03 ff f3 b7 MK salt: 6f 13 8b 44 48 d2 10 33 6a 95 a1 69 c6 f6 03 e8 0e 34 ce d1 e4 63 f1 b7 01 af ae e9 be cc c9 89 MK iterations: 23108 UUID: 64ca6b26-d808-4080-846a-f5a544a68787 Key Slot 0: ENABLED Iterations: 369216 Salt: 6b 2f 61 57 9e dc d8 4b af 85 06 2c 7b f0 c6 24 95 a2 61 38 dc 79 1f 98 81 d6 c6 b4 78 f8 05 96 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 19:58:29 -0400 (0:00:01.783) 0:19:53.332 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 19:58:29 -0400 (0:00:00.461) 0:19:53.794 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 19:58:30 -0400 (0:00:00.448) 0:19:54.242 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 19:58:30 -0400 (0:00:00.371) 0:19:54.614 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 19:58:30 -0400 (0:00:00.424) 0:19:55.038 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 19:58:31 -0400 (0:00:00.545) 0:19:55.584 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 19:58:32 -0400 (0:00:00.556) 0:19:56.140 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 19:58:32 -0400 (0:00:00.496) 0:19:56.637 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 19:58:32 -0400 (0:00:00.291) 0:19:56.928 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 19:58:33 -0400 (0:00:00.368) 0:19:57.296 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 19:58:33 -0400 
(0:00:00.453) 0:19:57.750 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 19:58:34 -0400 (0:00:00.505) 0:19:58.256 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 19:58:34 -0400 (0:00:00.415) 0:19:58.671 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 19:58:34 -0400 (0:00:00.317) 0:19:58.989 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 19:58:35 -0400 (0:00:00.315) 0:19:59.304 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 19:58:35 -0400 (0:00:00.400) 0:19:59.704 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 19:58:35 -0400 (0:00:00.215) 0:19:59.920 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 19:58:36 -0400 (0:00:00.366) 0:20:00.287 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 19:58:36 -0400 (0:00:00.302) 0:20:00.590 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 19:58:36 -0400 (0:00:00.305) 0:20:00.896 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } 
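The encryption subset reads the LUKS header directly and cross-checks /etc/crypttab. Everything it asserts is visible in the luksDump output earlier: "Version: 1" matches encryption_luks_version: luks1, "Cipher name: aes" plus "Cipher mode: xts-plain64" match encryption_cipher: aes-xts-plain64, and "MK bits: 512" matches encryption_key_size: 512. The crypttab line "luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 -" is mapped name, backing device, key file; per crypttab(5), a key file of "-" means no key file, i.e. the passphrase is prompted for. A condensed spot check over the same header fields (hypothetical; the test asserts against parsed facts instead):

    - name: Pull the LUKS header fields the test cares about
      ansible.builtin.shell: cryptsetup luksDump /dev/mapper/foo-test1 | grep -E '^(Version|Cipher name|Cipher mode|MK bits)'
      register: luks_hdr
      changed_when: false

    - name: Assert LUKS1, aes-xts-plain64, 512-bit master key
      ansible.builtin.assert:
        that:
          - luks_hdr.stdout is search('Version:\s+1')
          - luks_hdr.stdout is search('Cipher name:\s+aes')
          - luks_hdr.stdout is search('Cipher mode:\s+xts-plain64')
          - luks_hdr.stdout is search('MK bits:\s+512')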
TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 19:58:37 -0400 (0:00:00.340) 0:20:01.236 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 19:58:37 -0400 (0:00:00.334) 0:20:01.571 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 19:58:37 -0400 (0:00:00.304) 0:20:01.875 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 19:58:38 -0400 (0:00:00.403) 0:20:02.279 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 19:58:42 -0400 (0:00:04.595) 0:20:06.874 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 19:58:44 -0400 (0:00:02.074) 0:20:08.948 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 19:58:45 -0400 (0:00:00.466) 0:20:09.414 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 19:58:45 -0400 (0:00:00.404) 0:20:09.819 ******* ok: [managed-node4] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 19:58:47 -0400 (0:00:01.978) 0:20:11.797 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 
2026 19:58:48 -0400 (0:00:00.400) 0:20:12.198 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 19:58:48 -0400 (0:00:00.385) 0:20:12.584 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 19:58:48 -0400 (0:00:00.247) 0:20:12.831 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 19:58:49 -0400 (0:00:00.440) 0:20:13.272 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 19:58:49 -0400 (0:00:00.320) 0:20:13.592 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 19:58:49 -0400 (0:00:00.251) 0:20:13.844 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 19:58:50 -0400 (0:00:00.372) 0:20:14.217 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 19:58:50 -0400 (0:00:00.370) 0:20:14.588 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 19:58:50 -0400 (0:00:00.329) 0:20:14.918 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 19:58:51 -0400 (0:00:00.413) 0:20:15.331 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 19:58:51 -0400 (0:00:00.314) 0:20:15.645 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 19:58:51 -0400 (0:00:00.424) 0:20:16.070 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 19:58:52 -0400 (0:00:00.218) 0:20:16.289 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 19:58:52 -0400 (0:00:00.325) 0:20:16.614 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 19:58:52 -0400 (0:00:00.228) 0:20:16.842 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 19:58:53 -0400 (0:00:00.352) 0:20:17.195 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 19:58:53 -0400 (0:00:00.251) 0:20:17.446 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 19:58:53 -0400 (0:00:00.264) 0:20:17.711 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 19:58:53 -0400 (0:00:00.262) 0:20:17.973 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 19:58:54 -0400 (0:00:00.274) 0:20:18.248 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] 
************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 19:58:54 -0400 (0:00:00.288) 0:20:18.536 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 19:58:54 -0400 (0:00:00.442) 0:20:18.978 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.019454", "end": "2026-03-18 19:58:56.528723", "rc": 0, "start": "2026-03-18 19:58:56.509269" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 19:58:56 -0400 (0:00:02.078) 0:20:21.056 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 19:58:57 -0400 (0:00:00.406) 0:20:21.463 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 19:58:57 -0400 (0:00:00.498) 0:20:21.962 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 19:58:58 -0400 (0:00:00.362) 0:20:22.325 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 19:58:58 -0400 (0:00:00.388) 0:20:22.714 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 19:58:58 -0400 (0:00:00.361) 0:20:23.075 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 19:58:59 -0400 (0:00:00.415) 0:20:23.490 ******* ok: [managed-node4] => { 
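(The cache-verification step above shells out to lvs. As a rough sketch, with the module spelled out for clarity rather than the test's verbatim source, the task has this shape:

    - name: Get information about the LV (sketch of the verification step)
      ansible.builtin.command: >-
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lvs_info
      changed_when: false

The LVM2_SEGTYPE=linear field it prints is what "Set LV segment type" stores and "Check segment type" asserts against.)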
"ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 19:58:59 -0400 (0:00:00.332) 0:20:23.823 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 19:58:59 -0400 (0:00:00.262) 0:20:24.085 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391 Wednesday 18 March 2026 19:59:00 -0400 (0:00:00.385) 0:20:24.470 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 19:59:00 -0400 (0:00:00.546) 0:20:25.017 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 19:59:01 -0400 (0:00:00.228) 0:20:25.245 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 19:59:01 -0400 (0:00:00.438) 0:20:25.684 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 19:59:01 -0400 (0:00:00.354) 0:20:26.039 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 19:59:05 -0400 (0:00:03.201) 0:20:29.240 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if 
ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 19:59:05 -0400 (0:00:00.706) 0:20:29.946 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 19:59:06 -0400 (0:00:00.272) 0:20:30.219 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 19:59:06 -0400 (0:00:00.324) 0:20:30.544 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 19:59:06 -0400 (0:00:00.433) 0:20:30.977 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 19:59:07 -0400 (0:00:00.459) 0:20:31.436 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 19:59:08 -0400 (0:00:00.955) 0:20:32.392 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
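(The skipped/ok pattern above is a most-specific-wins vars lookup: the role tries RedHat.yml, CentOS.yml, CentOS_7.yml and CentOS_7.9.yml in turn and loads whichever files exist. A minimal sketch of that loop, noting the role's actual condition may differ:

    - name: Set platform/version specific variables (sketch)
      ansible.builtin.include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      when: (role_path ~ '/vars/' ~ item) is file

On this CentOS 7.9 host only CentOS_7.yml is loaded, and it supplies the blivet_package_list and mkfs option map shown above.)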
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 19:59:12 -0400 (0:00:03.973) 0:20:36.365 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 19:59:12 -0400 (0:00:00.457) 0:20:36.823 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 19:59:13 -0400 (0:00:00.330) 0:20:37.153 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 19:59:18 -0400 (0:00:05.776) 0:20:42.930 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 19:59:19 -0400 (0:00:00.606) 0:20:43.536 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 19:59:19 -0400 (0:00:00.174) 0:20:43.711 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 19:59:20 -0400 (0:00:00.418) 0:20:44.129 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 19:59:20 -0400 (0:00:00.268) 0:20:44.398 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 19:59:23 -0400 (0:00:03.168) 0:20:47.611 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { 
"name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { 
"name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service": { "name": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": 
"systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 19:59:27 -0400 (0:00:03.985) 0:20:51.597 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d0ebc5dd0\x2d5035\x2d41fd\x2dadcc\x2dae42ec9792ff.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "name": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target tmp.mount system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service dev-sda1.device systemd-journald.socket systemd-readahead-replay.service -.mount", "AllowIsolate": "no", "AmbientCapabilities": 
"0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff /dev/sda1 /tmp/storage_test45gdlvlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0ebc5dd0-5035-41fd-adcc-ae42ec9792ff ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_test45gdlvlukskey", "Restart": "no", "RestartUSec": "100ms", "Result": "success", 
"RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 19:59:30 -0400 (0:00:02.746) 0:20:54.343 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, 
"thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 19:59:36 -0400 (0:00:05.808) 0:21:00.152 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 19:59:36 -0400 (0:00:00.443) 0:21:00.596 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878230.3092973, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c644d8564554e92249a45b7f7a4406e856ca6e88", "ctime": 1773878230.3062973, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773878230.3062973, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 19:59:38 -0400 (0:00:01.726) 0:21:02.322 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 19:59:38 -0400 (0:00:00.312) 0:21:02.635 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d0ebc5dd0\x2d5035\x2d41fd\x2dadcc\x2dae42ec9792ff.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "name": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "DevicePolicy": "auto", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d0ebc5dd0\\x2d5035\\x2d41fd\\x2dadcc\\x2dae42ec9792ff.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 19:59:40 -0400 (0:00:02.294) 0:21:04.929 ******* ok: [managed-node4] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" } ], "packages": [ 
"cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 19:59:41 -0400 (0:00:00.479) 0:21:05.408 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 19:59:41 -0400 (0:00:00.405) 0:21:05.814 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 19:59:42 -0400 (0:00:00.313) 0:21:06.128 ******* TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 19:59:42 -0400 (0:00:00.222) 0:21:06.351 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 19:59:43 -0400 (0:00:01.656) 0:21:08.007 ******* ok: [managed-node4] => (item={u'src': u'/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 19:59:45 -0400 (0:00:01.677) 0:21:09.685 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 19:59:45 -0400 (0:00:00.376) 0:21:10.061 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : 
Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 19:59:47 -0400 (0:00:01.603) 0:21:11.665 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878249.8223534, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7b0ef0712d0af9b50e580058dadf397ff61e2c96", "ctime": 1773878239.2583232, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264034, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773878239.2583232, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071680136279", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 19:59:49 -0400 (0:00:01.692) 0:21:13.357 ******* TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 19:59:49 -0400 (0:00:00.303) 0:21:13.660 ******* ok: [managed-node4] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Wednesday 18 March 2026 19:59:51 -0400 (0:00:02.370) 0:21:16.031 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:411 Wednesday 18 March 2026 19:59:52 -0400 (0:00:00.507) 0:21:16.538 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 19:59:53 -0400 (0:00:00.578) 0:21:17.117 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 19:59:53 -0400 (0:00:00.497) 0:21:17.614 ******* skipping: [managed-node4] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 19:59:53 -0400 (0:00:00.389) 0:21:18.004 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "64ca6b26-d808-4080-846a-f5a544a68787" }, "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "size": "4G", "type": "crypt", "uuid": "f0d18298-00c4-4619-b1a0-0f28d9fd0c9d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task 
path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 19:59:55 -0400 (0:00:01.574) 0:21:19.579 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002968", "end": "2026-03-18 19:59:56.978144", "rc": 0, "start": "2026-03-18 19:59:56.975176" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 19:59:57 -0400 (0:00:01.877) 0:21:21.456 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002934", "end": "2026-03-18 19:59:58.927038", "failed_when_result": false, "rc": 0, "start": "2026-03-18 19:59:58.924104" }
STDOUT:
luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 19:59:59 -0400 (0:00:01.812) 0:21:23.269 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:00:00 -0400 (0:00:00.853) 0:21:24.123 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:00:00 -0400 (0:00:00.339) 0:21:24.462 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018054", "end": "2026-03-18 20:00:02.173862", "rc": 0, "start": "2026-03-18 20:00:02.155808" }
STDOUT:
0
TASK [Verify that VG shared value 
checks out] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:00:02 -0400 (0:00:02.351) 0:21:26.814 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:00:03 -0400 (0:00:00.504) 0:21:27.318 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:00:04 -0400 (0:00:00.858) 0:21:28.176 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:00:04 -0400 (0:00:00.532) 0:21:28.709 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:00:06 -0400 (0:00:02.088) 0:21:30.797 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:00:07 -0400 (0:00:00.379) 0:21:31.176 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:00:07 -0400 (0:00:00.392) 0:21:31.568 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:00:07 -0400 (0:00:00.252) 0:21:31.821 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:00:08 -0400 (0:00:00.445) 0:21:32.267 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false 
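The PV checks above amount to asserting that VG foo is backed by exactly the expected member disks of the expected type. A standalone sketch of the same verification, assuming lvm2's pvs utility (the assert and its regex are illustrative, not the suite's code):

    - name: List PVs and their owning VG (what the PV count/type checks inspect)
      command: pvs --noheadings -o pv_name,vg_name
      register: pvs_out
      changed_when: false

    - name: Assert /dev/sda is the sole PV backing foo (illustrative)
      assert:
        that:
          - pvs_out.stdout_lines | map('trim') | select('match', '^/dev/sda +foo$') | list | length == 1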
} TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:00:08 -0400 (0:00:00.396) 0:21:32.663 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:00:08 -0400 (0:00:00.288) 0:21:32.951 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:00:09 -0400 (0:00:00.633) 0:21:33.585 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.44.104 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:00:11 -0400 (0:00:01.922) 0:21:35.508 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:00:11 -0400 (0:00:00.356) 0:21:35.864 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:00:12 -0400 (0:00:00.582) 0:21:36.447 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:00:12 -0400 (0:00:00.341) 0:21:36.789 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:00:13 -0400 (0:00:00.316) 0:21:37.105 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:00:13 -0400 (0:00:00.294) 0:21:37.399 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set 
md chunk size regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:00:13 -0400 (0:00:00.370) 0:21:37.770 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:00:13 -0400 (0:00:00.312) 0:21:38.082 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:00:14 -0400 (0:00:00.429) 0:21:38.512 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:00:14 -0400 (0:00:00.354) 0:21:38.866 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:00:15 -0400 (0:00:00.357) 0:21:39.223 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:00:15 -0400 (0:00:00.329) 0:21:39.553 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:00:15 -0400 (0:00:00.382) 0:21:39.935 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:00:16 -0400 (0:00:00.305) 0:21:40.241 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:00:16 -0400 (0:00:00.759) 0:21:41.001 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 TASK [Get information about the 
LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 20:00:17 -0400 (0:00:00.679) 0:21:41.680 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 20:00:17 -0400 (0:00:00.325) 0:21:42.005 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 20:00:18 -0400 (0:00:00.386) 0:21:42.392 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 20:00:18 -0400 (0:00:00.379) 0:21:42.772 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 20:00:19 -0400 (0:00:00.355) 0:21:43.127 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 20:00:19 -0400 (0:00:00.365) 0:21:43.493 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 20:00:19 -0400 (0:00:00.352) 0:21:43.845 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:00:20 -0400 (0:00:00.339) 0:21:44.185 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:00:20 -0400 (0:00:00.789) 0:21:44.975 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 TASK [Get information about thinpool] ****************************************** task path: 
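The LVM RAID and thin-pool subsets are skipped on this run because test1 is a plain linear LV; when they do run, they inspect lvs fields such as the segment type and owning thin pool. A minimal sketch of that kind of query, assuming lvm2's lvs:

    - name: Query LV segment type and thin pool (what the raid/thin checks look at)
      command: lvs --noheadings -o segtype,pool_lv foo/test1
      register: lv_info
      changed_when: false  # reports 'linear' and an empty pool for a volume like this one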
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 20:00:21 -0400 (0:00:00.899) 0:21:45.874 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 20:00:22 -0400 (0:00:00.396) 0:21:46.270 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 20:00:22 -0400 (0:00:00.356) 0:21:46.627 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 20:00:22 -0400 (0:00:00.282) 0:21:46.910 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:00:23 -0400 (0:00:00.345) 0:21:47.255 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:00:23 -0400 (0:00:00.786) 0:21:48.042 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:00:24 -0400 (0:00:00.327) 0:21:48.369 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:00:24 -0400 (0:00:00.332) 0:21:48.702 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 20:00:25 -0400 (0:00:00.606) 0:21:49.308 ******* ok: [managed-node4] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 20:00:25 -0400 (0:00:00.275) 0:21:49.584 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 20:00:25 -0400 (0:00:00.391) 0:21:49.975 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 20:00:26 -0400 (0:00:00.287) 0:21:50.263 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 20:00:26 -0400 (0:00:00.360) 0:21:50.624 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 20:00:26 -0400 (0:00:00.192) 0:21:50.816 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:00:27 -0400 (0:00:00.376) 0:21:51.193 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:00:27 -0400 (0:00:00.280) 0:21:51.474 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:00:28 -0400 (0:00:00.744) 0:21:52.218 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 20:00:28 -0400 (0:00:00.668) 0:21:52.887 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 20:00:29 -0400 (0:00:00.313) 0:21:53.200 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 20:00:29 -0400 (0:00:00.355) 0:21:53.556 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 20:00:29 -0400 (0:00:00.245) 0:21:53.802 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 20:00:30 -0400 (0:00:01.232) 0:21:55.034 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 20:00:31 -0400 (0:00:00.351) 0:21:55.386 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 20:00:31 -0400 (0:00:00.370) 0:21:55.756 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:00:32 -0400 (0:00:00.342) 0:21:56.099 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:00:32 -0400 (0:00:00.823) 0:21:56.922 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:00:33 -0400 (0:00:00.311) 0:21:57.233 ******* skipping: [managed-node4] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 
2026 20:00:33 -0400 (0:00:00.298) 0:21:57.532 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:00:33 -0400 (0:00:00.319) 0:21:57.851 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:00:34 -0400 (0:00:00.251) 0:21:58.103 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:00:34 -0400 (0:00:00.325) 0:21:58.428 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:00:34 -0400 (0:00:00.317) 0:21:58.746 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:00:35 -0400 (0:00:00.363) 0:21:59.109 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:00:35 -0400 (0:00:00.277) 0:21:59.387 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:00:36 -0400 (0:00:00.709) 0:22:00.096 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:00:36 -0400 (0:00:00.378) 0:22:00.475 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:00:38 -0400 (0:00:01.860) 0:22:02.335 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:00:38 -0400 (0:00:00.280) 0:22:02.616 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:00:38 -0400 (0:00:00.331) 0:22:02.948 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:00:39 -0400 (0:00:00.370) 0:22:03.319 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:00:39 -0400 (0:00:00.390) 0:22:03.710 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:00:40 -0400 (0:00:00.417) 0:22:04.128 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:00:40 -0400 
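The mount subset above reduces to asserting that the decrypted mapper device is mounted at the requested mount point. A hedged sketch of that assertion against setup-gathered mount facts, not the suite's exact task:

    - name: Assert the LUKS mapper device is mounted at /opt/test1 (illustrative)
      assert:
        that:
          - ansible_facts.mounts | selectattr('mount', 'equalto', '/opt/test1') | selectattr('device', 'equalto', '/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787') | list | length == 1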
(0:00:00.358) 0:22:04.486 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:00:40 -0400 (0:00:00.405) 0:22:04.892 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:00:41 -0400 (0:00:00.395) 0:22:05.287 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:00:41 -0400 (0:00:00.312) 0:22:05.599 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:00:41 -0400 (0:00:00.245) 0:22:05.845 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:00:42 -0400 (0:00:00.386) 0:22:06.231 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:00:42 -0400 (0:00:00.674) 0:22:06.905 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:00:43 -0400 (0:00:00.449) 0:22:07.355 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:00:43 -0400 (0:00:00.387) 0:22:07.742 ******* skipping: [managed-node4] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:00:43 -0400 (0:00:00.247) 0:22:07.990 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:00:44 -0400 (0:00:00.515) 0:22:08.505 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:00:44 -0400 (0:00:00.307) 0:22:08.813 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:00:45 -0400 (0:00:00.516) 0:22:09.330 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:00:45 -0400 (0:00:00.401) 0:22:09.731 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878308.8775232, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878217.614261, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 168997, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878217.614261, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:00:47 -0400 (0:00:01.848) 0:22:11.580 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:00:47 -0400 (0:00:00.348) 0:22:11.929 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make 
sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:00:48 -0400 (0:00:00.302) 0:22:12.231 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:00:48 -0400 (0:00:00.344) 0:22:12.576 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:00:48 -0400 (0:00:00.373) 0:22:12.950 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:00:49 -0400 (0:00:00.293) 0:22:13.243 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:00:49 -0400 (0:00:00.434) 0:22:13.677 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878217.7212613, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878217.7212613, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 169050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878217.7212613, "nlink": 1, "path": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:00:51 -0400 (0:00:01.880) 0:22:15.558 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:00:54 -0400 (0:00:03.523) 0:22:19.081 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026272", "end": "2026-03-18 20:00:56.796175", "rc": 0, "start": "2026-03-18 20:00:56.769903" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 
Cipher name: aes
Cipher mode: xts-plain64
Hash spec: sha256
Payload offset: 8192
MK bits: 512
MK digest: cc 6e 4c c3 62 07 69 ca 28 7b 9a 7a 70 e2 5d f8 03 ff f3 b7
MK salt: 6f 13 8b 44 48 d2 10 33 6a 95 a1 69 c6 f6 03 e8 0e 34 ce d1 e4 63 f1 b7 01 af ae e9 be cc c9 89
MK iterations: 23108
UUID: 64ca6b26-d808-4080-846a-f5a544a68787
Key Slot 0: ENABLED
  Iterations: 369216
  Salt: 6b 2f 61 57 9e dc d8 4b af 85 06 2c 7b f0 c6 24 95 a2 61 38 dc 79 1f 98 81 d6 c6 b4 78 f8 05 96
  Key material offset: 8
  AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:00:57 -0400 (0:00:02.093) 0:22:21.175 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:00:57 -0400 (0:00:00.407) 0:22:21.583 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:00:57 -0400 (0:00:00.451) 0:22:22.034 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:00:58 -0400 (0:00:00.513) 0:22:22.547 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:00:58 -0400 (0:00:00.491) 0:22:23.039 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:00:59 -0400 (0:00:00.435) 0:22:23.474 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:00:59 -0400 (0:00:00.317) 0:22:23.792 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:00:59 -0400 (0:00:00.295) 0:22:24.087 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ 
"luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:01:00 -0400 (0:00:00.370) 0:22:24.458 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:01:00 -0400 (0:00:00.475) 0:22:24.933 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:01:01 -0400 (0:00:00.386) 0:22:25.319 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:01:01 -0400 (0:00:00.438) 0:22:25.757 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:01:02 -0400 (0:00:00.445) 0:22:26.202 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:01:02 -0400 (0:00:00.430) 0:22:26.633 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:01:02 -0400 (0:00:00.313) 0:22:26.947 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:01:03 -0400 (0:00:00.339) 0:22:27.286 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:01:03 -0400 (0:00:00.362) 0:22:27.649 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:01:03 -0400 (0:00:00.250) 0:22:27.900 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:01:04 -0400 (0:00:00.321) 0:22:28.221 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:01:04 -0400 (0:00:00.294) 0:22:28.515 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:01:04 -0400 (0:00:00.271) 0:22:28.787 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:01:05 -0400 (0:00:00.376) 0:22:29.164 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:01:05 -0400 (0:00:00.302) 0:22:29.467 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:01:05 -0400 (0:00:00.371) 0:22:29.838 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:01:07 -0400 (0:00:01.788) 0:22:31.626 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:01:09 -0400 (0:00:01.826) 0:22:33.452 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:01:09 -0400 (0:00:00.449) 0:22:33.902 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:01:10 -0400 (0:00:00.273) 0:22:34.175 ******* ok: [managed-node4] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:01:11 -0400 (0:00:01.607) 0:22:35.782 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:01:12 -0400 (0:00:00.403) 0:22:36.186 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:01:12 -0400 (0:00:00.295) 0:22:36.482 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:01:12 -0400 (0:00:00.397) 0:22:36.879 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:01:13 -0400 (0:00:00.369) 0:22:37.249 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:01:13 -0400 (0:00:00.422) 0:22:37.671 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:01:13 -0400 (0:00:00.387) 0:22:38.059 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:01:14 -0400 (0:00:00.405) 0:22:38.465 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:01:14 -0400 (0:00:00.322) 0:22:38.787 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:01:15 -0400 (0:00:00.342) 0:22:39.129 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:01:15 -0400 (0:00:00.305) 0:22:39.435 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:01:15 -0400 (0:00:00.538) 0:22:39.973 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:01:16 -0400 (0:00:00.348) 0:22:40.322 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:01:16 -0400 (0:00:00.340) 0:22:40.663 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:01:16 -0400 (0:00:00.354) 0:22:41.017 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:01:17 -0400 (0:00:00.373) 0:22:41.390 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:01:17 -0400 (0:00:00.324) 0:22:41.714 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:01:17 -0400 (0:00:00.372) 0:22:42.087 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:01:18 -0400 (0:00:00.322) 0:22:42.410 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:01:18 -0400 (0:00:00.357) 0:22:42.767 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:01:19 -0400 (0:00:00.340) 0:22:43.108 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:01:19 -0400 (0:00:00.268) 0:22:43.376 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:01:19 -0400 (0:00:00.321) 0:22:43.698 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.019180", "end": "2026-03-18 20:01:21.248784", "rc": 0, "start": "2026-03-18 20:01:21.229604" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:01:21 -0400 (0:00:02.005) 0:22:45.703 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:01:21 -0400 (0:00:00.324) 0:22:46.028 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:01:22 -0400 (0:00:00.427) 0:22:46.455 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:01:22 -0400 (0:00:00.398) 0:22:46.853 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
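The cache verification above keys off the name-prefixed lvs output shown in STDOUT. As a minimal sketch of the same check (task names and the lv_info register variable are illustrative; the lvs invocation itself is the one from the log), one could write:

    - name: Query segment type of foo/test1 (illustrative sketch)
      command: >
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lv_info
      changed_when: false

    - name: Assert the LV is linear (no cache pool attached)
      assert:
        that:
          # LVM2_SEGTYPE=linear means the LV is not a cached volume
          - "'LVM2_SEGTYPE=linear' in lv_info.stdout"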
"changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:01:23 -0400 (0:00:00.282) 0:22:47.136 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:01:23 -0400 (0:00:00.351) 0:22:47.488 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:01:23 -0400 (0:00:00.379) 0:22:47.867 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:01:24 -0400 (0:00:00.309) 0:22:48.176 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:01:24 -0400 (0:00:00.333) 0:22:48.509 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:01:24 -0400 (0:00:00.309) 0:22:48.819 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:417 Wednesday 18 March 2026 20:01:26 -0400 (0:00:01.671) 0:22:50.490 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:01:26 -0400 (0:00:00.595) 0:22:51.086 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:01:27 -0400 (0:00:00.412) 0:22:51.498 ******* included: 
META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:01:27 -0400 (0:00:00.444) 0:22:51.942 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:01:28 -0400 (0:00:00.296) 0:22:52.239 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:01:28 -0400 (0:00:00.289) 0:22:52.529 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:01:28 -0400 (0:00:00.425) 0:22:52.954 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:01:32 -0400 (0:00:03.146) 0:22:56.101 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:01:32 -0400 (0:00:00.816) 0:22:56.918 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:01:33 -0400 (0:00:00.370) 
0:22:57.289 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:01:33 -0400 (0:00:00.281) 0:22:57.571 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:01:33 -0400 (0:00:00.295) 0:22:57.866 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:01:34 -0400 (0:00:00.265) 0:22:58.131 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:01:35 -0400 (0:00:01.084) 0:22:59.216 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:01:39 -0400 (0:00:03.933) 0:23:03.149 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:01:39 -0400 (0:00:00.366) 0:23:03.516 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:01:39 -0400 (0:00:00.378) 0:23:03.894 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }
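The storage_pools value printed above maps directly onto a role invocation. Below is a minimal sketch of an equivalent play (the play framing and the storage_safe_mode setting are assumptions for illustration; the pool/volume spec is the one shown in the log). Because the volume currently carries a LUKS layer, encryption: false amounts to a request to strip that layer from test1:

    - hosts: managed-node4
      vars:
        storage_safe_mode: true  # assumed here; the log shows storage_safe_mode_global: true
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: false
                encryption_password: yabbadabbadoo
      roles:
        - fedora.linux_system_roles.storage

The "Get required packages" result just above appears to be a planning-only pass of the same underlying module: it proposes no actions yet and reports that only lvm2 is needed for this configuration.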
[], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:01:45 -0400 (0:00:05.995) 0:23:09.890 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:01:46 -0400 (0:00:00.506) 0:23:10.396 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:01:46 -0400 (0:00:00.208) 0:23:10.605 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:01:46 -0400 (0:00:00.300) 0:23:10.906 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:01:47 -0400 (0:00:00.253) 0:23:11.159 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:01:50 -0400 (0:00:03.827) 0:23:14.987 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": 
"chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { 
"name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service": { "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:01:54 -0400 (0:00:03.211) 0:23:18.199 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d64ca6b26\x2dd808\x2d4080\x2d846a\x2df5a544a68787.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-readahead-collect.service systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-64ca6b26-d808-4080-846a-f5a544a68787", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-64ca6b26-d808-4080-846a-f5a544a68787 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": 
"disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:01:56 -0400 (0:00:02.335) 0:23:20.535 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-64ca6b26-d808-4080-846a-f5a544a68787' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:02:02 -0400 (0:00:06.042) 0:23:26.578 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_cipher': None, u'deduplication': None, u'vdo_pool_size': None, u'encryption_key_size': 0, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 
0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-64ca6b26-d808-4080-846a-f5a544a68787' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:02:03 -0400 (0:00:00.608) 0:23:27.186 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d64ca6b26\x2dd808\x2d4080\x2d846a\x2df5a544a68787.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": 
"18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:02:05 -0400 (0:00:02.282) 0:23:29.469 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:02:05 -0400 (0:00:00.480) 0:23:29.949 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:02:06 -0400 (0:00:00.347) 0:23:30.297 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:02:06 -0400 (0:00:00.313) 0:23:30.611 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878486.1420324, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773878486.1420324, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": 
false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773878486.1420324, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744071794180263", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:02:08 -0400 (0:00:01.771) 0:23:32.383 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:440 Wednesday 18 March 2026 20:02:08 -0400 (0:00:00.272) 0:23:32.656 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:02:09 -0400 (0:00:00.496) 0:23:33.153 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:02:09 -0400 (0:00:00.260) 0:23:33.413 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:02:09 -0400 (0:00:00.376) 0:23:33.790 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:02:09 -0400 (0:00:00.283) 0:23:34.074 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:02:12 -0400 (0:00:02.963) 0:23:37.037 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", 
"libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:02:13 -0400 (0:00:00.594) 0:23:37.632 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:02:13 -0400 (0:00:00.235) 0:23:37.868 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:02:14 -0400 (0:00:00.258) 0:23:38.126 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:02:14 -0400 (0:00:00.366) 0:23:38.492 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:02:14 -0400 (0:00:00.306) 0:23:38.798 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:02:15 -0400 (0:00:00.558) 0:23:39.357 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] 
****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:02:18 -0400 (0:00:02.799) 0:23:42.156 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:02:18 -0400 (0:00:00.322) 0:23:42.479 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:02:18 -0400 (0:00:00.284) 0:23:42.763 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:02:24 -0400 (0:00:05.780) 0:23:48.544 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:02:24 -0400 (0:00:00.355) 0:23:48.899 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:02:25 -0400 (0:00:00.232) 0:23:49.132 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:02:25 -0400 (0:00:00.231) 0:23:49.364 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:02:25 -0400 (0:00:00.230) 0:23:49.595 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:02:28 -0400 (0:00:03.236) 0:23:52.832 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "active", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": 
"kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": 
"nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service": { "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": 
"systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:02:31 -0400 (0:00:02.907) 0:23:55.739 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d64ca6b26\x2dd808\x2d4080\x2d846a\x2df5a544a68787.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service systemd-journald.socket cryptsetup-pre.target 
system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-64ca6b26-d808-4080-846a-f5a544a68787", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-64ca6b26-d808-4080-846a-f5a544a68787 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target 
dev-mapper-luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:02:34 -0400 (0:00:02.555) 0:23:58.295 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-64ca6b26-d808-4080-846a-f5a544a68787", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": 
[ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:03:40 -0400 (0:01:06.561) 0:25:04.856 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:03:41 -0400 (0:00:00.288) 0:25:05.144 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878230.3092973, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c644d8564554e92249a45b7f7a4406e856ca6e88", "ctime": 1773878230.3062973, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773878230.3062973, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:03:42 -0400 (0:00:01.816) 0:25:06.961 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:03:44 -0400 (0:00:01.922) 0:25:08.883 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d64ca6b26\x2dd808\x2d4080\x2d846a\x2df5a544a68787.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": 
"no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": 
"dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:03:47 -0400 (0:00:02.565) 0:25:11.449 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-64ca6b26-d808-4080-846a-f5a544a68787", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:03:47 -0400 (0:00:00.384) 0:25:11.833 ******* ok: [managed-node4] => { "ansible_facts": { 
"_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:03:48 -0400 (0:00:00.352) 0:25:12.186 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:03:48 -0400 (0:00:00.456) 0:25:12.642 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-64ca6b26-d808-4080-846a-f5a544a68787" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:03:50 -0400 (0:00:01.912) 0:25:14.555 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:03:52 -0400 (0:00:02.353) 0:25:16.908 ******* changed: [managed-node4] => 
(item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:03:55 -0400 (0:00:02.507) 0:25:19.416 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:03:55 -0400 (0:00:00.540) 0:25:19.957 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:03:58 -0400 (0:00:02.420) 0:25:22.377 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878249.8223534, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7b0ef0712d0af9b50e580058dadf397ff61e2c96", "ctime": 1773878239.2583232, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264034, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773878239.2583232, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071680136279", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:04:00 -0400 (0:00:02.207) 0:25:24.586 ******* changed: [managed-node4] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-64ca6b26-d808-4080-846a-f5a544a68787', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": 
"luks-64ca6b26-d808-4080-846a-f5a544a68787", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:04:02 -0400 (0:00:02.119) 0:25:26.705 ******* ok: [managed-node4] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:455 Wednesday 18 March 2026 20:04:05 -0400 (0:00:02.808) 0:25:29.513 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:04:06 -0400 (0:00:00.828) 0:25:30.342 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:04:06 -0400 (0:00:00.484) 0:25:30.826 ******* skipping: [managed-node4] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:04:07 -0400 (0:00:00.377) 0:25:31.203 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "514b4b6b-9c36-4922-b013-64ad1942a345" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:04:08 -0400 (0:00:01.857) 0:25:33.061 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002632", "end": "2026-03-18 20:04:10.811762", "rc": 0, "start": "2026-03-18 20:04:10.809130" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs 
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:04:11 -0400 (0:00:02.287) 0:25:35.348 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002665", "end": "2026-03-18 20:04:12.755294", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:04:12.752629" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:04:13 -0400 (0:00:01.800) 0:25:37.148 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:04:13 -0400 (0:00:00.450) 0:25:37.598 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:04:13 -0400 (0:00:00.341) 0:25:37.940 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017690", "end": "2026-03-18 20:04:15.308709", "rc": 0, "start": "2026-03-18 20:04:15.291019" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:04:15 -0400 (0:00:01.766) 0:25:39.706 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:04:15 -0400 (0:00:00.289) 0:25:39.995 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:04:16 -0400 (0:00:00.549) 0:25:40.545 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:04:16 -0400 (0:00:00.335) 0:25:40.881 ******* ok: [managed-node4] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:04:18 -0400 (0:00:01.634) 0:25:42.515 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:04:18 -0400 (0:00:00.289) 0:25:42.805 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:04:19 -0400 (0:00:00.381) 0:25:43.187 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:04:19 -0400 (0:00:00.377) 0:25:43.564 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:04:19 -0400 (0:00:00.289) 0:25:43.853 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:04:19 -0400 (0:00:00.201) 0:25:44.054 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:04:20 -0400 (0:00:00.188) 0:25:44.243 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:04:20 -0400 (0:00:00.287) 0:25:44.530 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.44.104 closed. 
MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:04:21 -0400 (0:00:01.370) 0:25:45.900 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:04:21 -0400 (0:00:00.167) 0:25:46.068 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:04:22 -0400 (0:00:00.477) 0:25:46.545 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:04:22 -0400 (0:00:00.191) 0:25:46.737 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:04:22 -0400 (0:00:00.327) 0:25:47.064 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:04:23 -0400 (0:00:00.234) 0:25:47.299 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:04:23 -0400 (0:00:00.203) 0:25:47.502 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:04:23 -0400 (0:00:00.176) 0:25:47.678 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:04:24 -0400 (0:00:00.427) 0:25:48.106 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:04:24 -0400 (0:00:00.099) 0:25:48.206 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:04:24 -0400 (0:00:00.173) 0:25:48.379 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:04:24 -0400 (0:00:00.159) 0:25:48.539 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:04:24 -0400 (0:00:00.072) 0:25:48.611 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:04:24 -0400 (0:00:00.098) 0:25:48.710 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:04:24 -0400 (0:00:00.236) 0:25:48.946 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 20:04:25 -0400 (0:00:00.403) 0:25:49.349 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 20:04:25 -0400 (0:00:00.165) 0:25:49.515 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 20:04:25 -0400 (0:00:00.104) 0:25:49.619 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 20:04:25 -0400 (0:00:00.215) 0:25:49.834 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 20:04:25 -0400 (0:00:00.155) 0:25:49.990 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 20:04:26 -0400 (0:00:00.205) 0:25:50.195 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 20:04:26 -0400 (0:00:00.342) 0:25:50.538 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:04:26 -0400 (0:00:00.217) 0:25:50.755 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:04:27 -0400 (0:00:00.426) 0:25:51.182 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 20:04:27 -0400 (0:00:00.531) 0:25:51.714 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 20:04:27 -0400 (0:00:00.157) 0:25:51.871 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 20:04:28 -0400 (0:00:00.240) 0:25:52.112 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 20:04:28 -0400 (0:00:00.201) 0:25:52.313 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:04:28 -0400 (0:00:00.178) 0:25:52.491 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:04:28 -0400 (0:00:00.550) 0:25:53.042 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:04:29 -0400 (0:00:00.343) 0:25:53.386 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:04:29 -0400 (0:00:00.330) 0:25:53.716 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 20:04:30 -0400 (0:00:00.615) 0:25:54.331 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 20:04:30 -0400 (0:00:00.259) 0:25:54.591 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 20:04:30 -0400 (0:00:00.324) 0:25:54.915 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 20:04:31 -0400 (0:00:00.363) 0:25:55.278 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 20:04:31 -0400 (0:00:00.314) 0:25:55.593 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 20:04:31 -0400 (0:00:00.324) 0:25:55.917 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:04:32 -0400 (0:00:00.280) 0:25:56.197 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:04:32 -0400 (0:00:00.209) 0:25:56.406 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:04:32 -0400 (0:00:00.654) 0:25:57.061 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 20:04:33 -0400 (0:00:00.644) 0:25:57.706 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 20:04:34 -0400 (0:00:00.406) 0:25:58.112 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 20:04:34 -0400 (0:00:00.419) 0:25:58.532 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 20:04:34 -0400 (0:00:00.380) 0:25:58.912 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off - 2] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 20:04:35 -0400 (0:00:00.260) 0:25:59.173 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 20:04:35 -0400 (0:00:00.292) 0:25:59.466 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 20:04:35 -0400 (0:00:00.224) 0:25:59.691 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:04:35 -0400 (0:00:00.221) 0:25:59.912 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:04:36 -0400 (0:00:00.782) 0:26:00.695 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:04:36 -0400 (0:00:00.222) 0:26:00.959 ******* skipping: [managed-node4] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:04:37 -0400 (0:00:00.401) 0:26:01.182 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:04:37 -0400 (0:00:00.360) 0:26:01.583 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:04:37 -0400 (0:00:00.244) 0:26:01.944 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:04:38 -0400 (0:00:00.244)
0:26:02.188 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:04:38 -0400 (0:00:00.281) 0:26:02.470 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:04:38 -0400 (0:00:00.403) 0:26:02.874 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:04:39 -0400 (0:00:00.331) 0:26:03.206 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:04:39 -0400 (0:00:00.546) 0:26:03.753 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:04:40 -0400 (0:00:00.386) 0:26:04.139 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:04:42 -0400 (0:00:02.082) 0:26:06.221 ******* ok: 
[managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:04:42 -0400 (0:00:00.411) 0:26:06.633 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:04:43 -0400 (0:00:00.475) 0:26:07.108 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:04:43 -0400 (0:00:00.387) 0:26:07.495 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:04:43 -0400 (0:00:00.353) 0:26:07.849 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:04:44 -0400 (0:00:00.313) 0:26:08.162 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:04:44 -0400 (0:00:00.478) 0:26:08.640 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:04:44 -0400 (0:00:00.413) 0:26:09.054 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:04:45 -0400 (0:00:00.390) 0:26:09.444 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:04:45 -0400 (0:00:00.360) 0:26:09.805 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] 
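The mount assertions above, and the fstab matching that follows, can be reproduced as standalone checks. This is a sketch under the assumption of shell access on the managed node; task and register names are illustrative:

    - name: Confirm /opt/test1 is mounted from the expected LV
      command: findmnt --noheadings -o SOURCE,FSTYPE /opt/test1
      register: mount_probe  # illustrative name
      changed_when: false
      failed_when: "'/dev/mapper/foo-test1' not in mount_probe.stdout or 'xfs' not in mount_probe.stdout"

    - name: Confirm exactly one fstab entry references the volume
      command: grep -c '^/dev/mapper/foo-test1 ' /etc/fstab
      register: fstab_probe  # illustrative name
      changed_when: false
      failed_when: fstab_probe.stdout | trim != '1'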
************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:04:45 -0400 (0:00:00.289) 0:26:10.094 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:04:46 -0400 (0:00:00.171) 0:26:10.266 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:04:46 -0400 (0:00:00.527) 0:26:10.794 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:04:46 -0400 (0:00:00.292) 0:26:11.086 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:04:47 -0400 (0:00:00.322) 0:26:11.409 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:04:48 -0400 (0:00:01.533) 0:26:12.943 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:04:49 -0400 (0:00:00.296) 0:26:13.239 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:04:49 -0400 (0:00:00.312) 0:26:13.552 
******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:04:49 -0400 (0:00:00.473) 0:26:14.026 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:04:50 -0400 (0:00:00.425) 0:26:14.451 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878620.229415, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878620.229415, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 190946, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878620.229415, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:04:52 -0400 (0:00:01.700) 0:26:16.152 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:04:52 -0400 (0:00:00.459) 0:26:16.612 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:04:52 -0400 (0:00:00.388) 0:26:17.001 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:04:53 -0400 (0:00:00.276) 0:26:17.278 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:04:53 -0400 (0:00:00.311) 0:26:17.590 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 
2026 20:04:53 -0400 (0:00:00.283) 0:26:17.873 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:04:54 -0400 (0:00:00.277) 0:26:18.151 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:04:54 -0400 (0:00:00.307) 0:26:18.458 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:04:58 -0400 (0:00:03.701) 0:26:22.160 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:04:58 -0400 (0:00:00.281) 0:26:22.441 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:04:58 -0400 (0:00:00.353) 0:26:22.794 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:04:59 -0400 (0:00:00.563) 0:26:23.358 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:04:59 -0400 (0:00:00.338) 0:26:23.696 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:04:59 -0400 (0:00:00.255) 0:26:23.951 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:05:00 -0400 (0:00:00.327) 0:26:24.279 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:05:00 -0400 (0:00:00.335) 0:26:24.615 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:05:00 -0400 (0:00:00.444) 0:26:25.059 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:05:01 -0400 (0:00:00.432) 0:26:25.491 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:05:01 -0400 (0:00:00.398) 0:26:25.890 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:05:02 -0400 (0:00:00.253) 0:26:26.143 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:05:02 -0400 (0:00:00.356) 0:26:26.500 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:05:02 -0400 (0:00:00.301) 0:26:26.802 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:05:02 -0400 (0:00:00.190) 0:26:26.993 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:05:03 -0400 (0:00:00.208) 0:26:27.201 ******* skipping: [managed-node4] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:05:03 -0400 (0:00:00.271) 0:26:27.472 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:05:03 -0400 (0:00:00.396) 0:26:27.869 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:05:04 -0400 (0:00:00.262) 0:26:28.131 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:05:04 -0400 (0:00:00.406) 0:26:28.538 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:05:04 -0400 (0:00:00.401) 0:26:28.940 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:05:05 -0400 (0:00:00.339) 0:26:29.280 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:05:05 -0400 (0:00:00.340) 0:26:29.621 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:05:05 -0400 (0:00:00.296) 0:26:29.917 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:05:06 -0400 (0:00:00.461) 0:26:30.379 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:05:08 -0400 (0:00:01.825) 0:26:32.204 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:05:10 -0400 (0:00:01.970) 0:26:34.175 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:05:10 -0400 (0:00:00.465) 0:26:34.640 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:05:10 -0400 (0:00:00.299) 0:26:34.940 ******* ok: [managed-node4] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:05:12 -0400 (0:00:01.962) 0:26:36.902 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:05:13 -0400 (0:00:00.193) 0:26:37.095 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:05:13 -0400 (0:00:00.072) 0:26:37.168 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:05:13 -0400 (0:00:00.526) 0:26:37.694 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:05:14 -0400 (0:00:00.418) 0:26:38.112 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:05:14 -0400 (0:00:00.393) 0:26:38.506 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** 
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:05:14 -0400 (0:00:00.337) 0:26:38.844 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:05:15 -0400 (0:00:00.362) 0:26:39.206 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:05:15 -0400 (0:00:00.333) 0:26:39.540 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:05:15 -0400 (0:00:00.443) 0:26:39.983 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:05:16 -0400 (0:00:00.321) 0:26:40.305 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:05:16 -0400 (0:00:00.378) 0:26:40.683 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:05:16 -0400 (0:00:00.365) 0:26:41.049 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:05:17 -0400 (0:00:00.342) 0:26:41.391 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:05:17 -0400 (0:00:00.289) 0:26:41.681 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:05:17 -0400 (0:00:00.328) 0:26:42.010 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** 
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:05:18 -0400 (0:00:00.286) 0:26:42.296 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:05:18 -0400 (0:00:00.304) 0:26:42.601 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:05:18 -0400 (0:00:00.242) 0:26:42.843 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:05:19 -0400 (0:00:00.312) 0:26:43.155 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:05:19 -0400 (0:00:00.413) 0:26:43.569 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:05:19 -0400 (0:00:00.270) 0:26:43.839 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:05:20 -0400 (0:00:00.538) 0:26:44.378 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.020205", "end": "2026-03-18 20:05:21.634135", "rc": 0, "start": "2026-03-18 20:05:21.613930" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:05:22 -0400 (0:00:01.725) 0:26:46.103 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:05:22 -0400 (0:00:00.364) 0:26:46.468 ******* 
ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:05:22 -0400 (0:00:00.485) 0:26:46.953 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:05:23 -0400 (0:00:00.281) 0:26:47.234 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:05:23 -0400 (0:00:00.246) 0:26:47.480 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:05:23 -0400 (0:00:00.247) 0:26:47.728 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:05:23 -0400 (0:00:00.256) 0:26:47.985 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:05:24 -0400 (0:00:00.286) 0:26:48.271 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:05:24 -0400 (0:00:00.176) 0:26:48.447 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:05:24 -0400 (0:00:00.327) 0:26:48.775 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:461 Wednesday 18 March 2026 20:05:26 -0400 (0:00:01.685) 0:26:50.461 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store 
global variable value copy] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:05:27 -0400 (0:00:00.718) 0:26:51.179 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:05:27 -0400 (0:00:00.421) 0:26:51.601 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:05:28 -0400 (0:00:00.560) 0:26:52.161 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:05:28 -0400 (0:00:00.233) 0:26:52.394 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:05:28 -0400 (0:00:00.450) 0:26:52.845 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:05:29 -0400 (0:00:00.418) 0:26:53.264 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:05:32 -0400 (0:00:03.376) 0:26:56.640 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": 
false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:05:33 -0400 (0:00:00.788) 0:26:57.428 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:05:33 -0400 (0:00:00.373) 0:26:57.801 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:05:33 -0400 (0:00:00.286) 0:26:58.088 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:05:34 -0400 (0:00:00.327) 0:26:58.416 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:05:34 -0400 (0:00:00.393) 0:26:58.809 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:05:35 -0400 (0:00:00.806) 0:26:59.616 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:05:38 -0400 (0:00:03.412) 0:27:03.028 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": 
"test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:05:39 -0400 (0:00:00.449) 0:27:03.478 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:05:39 -0400 (0:00:00.378) 0:27:03.856 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:05:45 -0400 (0:00:05.937) 0:27:09.794 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:05:46 -0400 (0:00:00.528) 0:27:10.323 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:05:46 -0400 (0:00:00.225) 0:27:10.548 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:05:46 -0400 (0:00:00.350) 0:27:10.899 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:05:47 -0400 (0:00:00.368) 0:27:11.267 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:05:50 -0400 (0:00:03.689) 0:27:14.957 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": 
"dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { 
"name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { 
"name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service": { "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { 
"name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:05:53 -0400 (0:00:03.100) 0:27:18.057 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d64ca6b26\x2dd808\x2d4080\x2d846a\x2df5a544a68787.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-readahead-collect.service systemd-journald.socket systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": 
"infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-64ca6b26-d808-4080-846a-f5a544a68787", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-64ca6b26-d808-4080-846a-f5a544a68787 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-64ca6b26-d808-4080-846a-f5a544a68787 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", 
"StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:05:56 -0400 (0:00:02.397) 0:27:20.455 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:06:02 -0400 (0:00:05.873) 0:27:26.329 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'uses_kmod_kvdo': True, u'disklabel_type': None, u'safe_mode': True, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'part_type': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_cipher': None, u'deduplication': None, u'vdo_pool_size': None, u'encryption_key_size': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': 
[], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:06:02 -0400 (0:00:00.479) 0:27:26.808 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d64ca6b26\x2dd808\x2d4080\x2d846a\x2df5a544a68787.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "name": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": 
"0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d64ca6b26\\x2dd808\\x2d4080\\x2d846a\\x2df5a544a68787.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:06:05 -0400 (0:00:02.468) 0:27:29.276 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:06:05 -0400 (0:00:00.565) 0:27:29.842 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:06:06 -0400 (0:00:00.485) 0:27:30.327 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:06:06 -0400 (0:00:00.278) 0:27:30.605 ******* ok: [managed-node4] => 
{ "changed": false, "stat": { "atime": 1773878726.1037169, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773878726.1037169, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773878726.1037169, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744071631533454", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:06:08 -0400 (0:00:01.651) 0:27:32.257 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:484 Wednesday 18 March 2026 20:06:08 -0400 (0:00:00.381) 0:27:32.639 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:06:09 -0400 (0:00:00.689) 0:27:33.329 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:06:09 -0400 (0:00:00.307) 0:27:33.636 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:06:09 -0400 (0:00:00.308) 0:27:33.945 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:06:10 -0400 (0:00:00.429) 0:27:34.374 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:06:13 -0400 (0:00:03.095) 0:27:37.469 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": 
"Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:06:14 -0400 (0:00:00.652) 0:27:38.122 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:06:14 -0400 (0:00:00.412) 0:27:38.534 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:06:14 -0400 (0:00:00.345) 0:27:38.879 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:06:15 -0400 (0:00:00.272) 0:27:39.152 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:06:15 -0400 (0:00:00.222) 0:27:39.374 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:06:16 -0400 (0:00:00.877) 0:27:40.252 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing 
libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:06:20 -0400 (0:00:04.057) 0:27:44.309 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:06:20 -0400 (0:00:00.374) 0:27:44.684 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:06:20 -0400 (0:00:00.376) 0:27:45.061 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:06:27 -0400 (0:00:06.206) 0:27:51.267 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:06:28 -0400 (0:00:01.426) 0:27:52.693 ******* TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:06:28 -0400 (0:00:00.259) 0:27:52.953 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:06:29 -0400 (0:00:00.292) 0:27:53.246 ******* TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:06:29 -0400 (0:00:00.244) 0:27:53.491 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] 
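
NOTE: This second pass feeds the role the same pool specification that failed at main-blivet.yml:70 above. That failure ("cannot remove existing formatting on device 'test1' in safe mode due to adding encryption") is the role's safe mode working as intended: with safe_mode=true, the blivet module refuses to destroy the existing xfs filesystem on test1 in order to put a LUKS container in its place. To make the change apply, the test re-runs the role with safe mode disabled. A minimal sketch of such an invocation, assembled from the pool values reported by the Show storage_pools task above (storage_safe_mode is the role's documented toggle; the password is the test's dummy value):

    - name: Add encryption to the volume
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false          # allow destructive re-format of test1
        storage_pools:
          - name: foo
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_password: yabbadabbadoo   # dummy test passphrase
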
******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:06:32 -0400 (0:00:03.394) 0:27:56.885 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", 
"state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": 
"rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": 
{ "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": 
"systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:06:35 -0400 (0:00:02.705) 0:27:59.590 ******* TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:06:36 -0400 (0:00:00.591) 0:28:00.182 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": 
"xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:06:49 -0400 (0:00:12.978) 0:28:13.160 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:06:49 -0400 (0:00:00.225) 0:28:13.386 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878634.7604563, 
"attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fceedeef6c619b69ada96279531b69ed89734ba", "ctime": 1773878634.7574563, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263969, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773878634.7574563, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1279, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:06:50 -0400 (0:00:01.255) 0:28:14.642 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:06:52 -0400 (0:00:01.573) 0:28:16.215 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:06:52 -0400 (0:00:00.315) 0:28:16.530 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:06:52 -0400 (0:00:00.323) 0:28:16.854 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:06:53 -0400 (0:00:00.342) 0:28:17.196 ******* ok: [managed-node4] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:06:53 -0400 (0:00:00.373) 0:28:17.569 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:06:54 -0400 (0:00:01.405) 0:28:18.975 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:06:56 -0400 (0:00:01.199) 0:28:20.174 ******* changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:06:57 -0400 (0:00:01.392) 0:28:21.567 ******* skipping: [managed-node4] => (item={u'src': u'/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:06:57 -0400 (0:00:00.242) 0:28:21.810 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 
Wednesday 18 March 2026 20:06:59 -0400 (0:00:01.744) 0:28:23.554 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878652.7545078, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773878642.0774772, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264033, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773878642.0764773, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071680136663", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:07:00 -0400 (0:00:01.043) 0:28:24.598 ******* changed: [managed-node4] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-741e9deb-aed5-493b-a35c-627bfb56cbc5', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:07:01 -0400 (0:00:01.352) 0:28:25.950 ******* ok: [managed-node4] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:499 Wednesday 18 March 2026 20:07:04 -0400 (0:00:02.239) 0:28:28.190 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:07:04 -0400 (0:00:00.513) 0:28:28.703 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:07:04 -0400 (0:00:00.222) 0:28:28.926 ******* skipping: [managed-node4] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:07:05 -0400 (0:00:00.205) 0:28:29.131 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "741e9deb-aed5-493b-a35c-627bfb56cbc5" }, "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "size": "4G", "type": "crypt", "uuid": "a0cbcbb8-18bf-45aa-9e28-7c4c7f3601e8" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:07:06 -0400 (0:00:01.701) 0:28:30.832 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003028", "end": "2026-03-18 20:07:07.938665", "rc": 0, "start": "2026-03-18 20:07:07.935637" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:07:08 -0400 (0:00:01.468) 0:28:32.301 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003032", "end": "2026-03-18 20:07:09.192105", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:07:09.189073" } STDOUT: luks-741e9deb-aed5-493b-a35c-627bfb56cbc5 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:07:09 -0400 (0:00:01.315) 0:28:33.617 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:07:10 -0400 (0:00:00.480) 0:28:34.098 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:07:10 -0400 (0:00:00.200) 0:28:34.298 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017272", "end": "2026-03-18 20:07:11.439764", "rc": 0, "start": "2026-03-18 20:07:11.422492" } STDOUT: 0 TASK [Verify that VG shared value checks 
out] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:07:11 -0400 (0:00:01.584) 0:28:35.883 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:07:11 -0400 (0:00:00.186) 0:28:36.069 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:07:12 -0400 (0:00:00.580) 0:28:36.649 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:07:13 -0400 (0:00:00.491) 0:28:37.141 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:07:14 -0400 (0:00:01.582) 0:28:38.724 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:07:14 -0400 (0:00:00.349) 0:28:39.073 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:07:15 -0400 (0:00:00.213) 0:28:39.286 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:07:15 -0400 (0:00:00.337) 0:28:39.624 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:07:15 -0400 (0:00:00.188) 0:28:39.812 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK 
[Set expected pv type - 3] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:07:16 -0400 (0:00:00.359) 0:28:40.172 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:07:16 -0400 (0:00:00.284) 0:28:40.457 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:07:16 -0400 (0:00:00.259) 0:28:40.717 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.44.104 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:07:18 -0400 (0:00:01.474) 0:28:42.191 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:07:18 -0400 (0:00:00.381) 0:28:42.573 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:07:19 -0400 (0:00:00.683) 0:28:43.257 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:07:19 -0400 (0:00:00.420) 0:28:43.677 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:07:19 -0400 (0:00:00.321) 0:28:43.999 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:07:20 -0400 (0:00:00.206) 0:28:44.206 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md 
chunk size regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:07:20 -0400 (0:00:00.281) 0:28:44.487 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:07:20 -0400 (0:00:00.281) 0:28:44.768 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:07:20 -0400 (0:00:00.294) 0:28:45.063 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:07:21 -0400 (0:00:00.333) 0:28:45.396 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:07:21 -0400 (0:00:00.268) 0:28:45.665 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:07:21 -0400 (0:00:00.189) 0:28:45.854 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:07:21 -0400 (0:00:00.116) 0:28:45.970 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:07:22 -0400 (0:00:00.372) 0:28:46.343 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:07:22 -0400 (0:00:00.594) 0:28:46.938 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 TASK [Get information about the LV] 
******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 20:07:23 -0400 (0:00:00.620) 0:28:47.558 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 20:07:23 -0400 (0:00:00.353) 0:28:47.912 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 20:07:24 -0400 (0:00:00.357) 0:28:48.269 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 20:07:24 -0400 (0:00:00.330) 0:28:48.600 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 20:07:24 -0400 (0:00:00.301) 0:28:48.901 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 20:07:25 -0400 (0:00:00.298) 0:28:49.199 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 20:07:25 -0400 (0:00:00.326) 0:28:49.526 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:07:25 -0400 (0:00:00.383) 0:28:49.909 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:07:26 -0400 (0:00:00.919) 0:28:50.828 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 TASK [Get information about thinpool] ****************************************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 20:07:27 -0400 (0:00:00.688) 0:28:51.517 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 20:07:27 -0400 (0:00:00.236) 0:28:51.754 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 20:07:28 -0400 (0:00:00.361) 0:28:52.115 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 20:07:28 -0400 (0:00:00.253) 0:28:52.369 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:07:28 -0400 (0:00:00.318) 0:28:52.687 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:07:29 -0400 (0:00:00.756) 0:28:53.443 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:07:29 -0400 (0:00:00.365) 0:28:53.809 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:07:30 -0400 (0:00:00.338) 0:28:54.148 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 20:07:30 -0400 (0:00:00.539) 0:28:54.687 ******* ok: [managed-node4] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 20:07:30 -0400 (0:00:00.358) 0:28:55.046 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 20:07:31 -0400 (0:00:00.197) 0:28:55.244 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 20:07:31 -0400 (0:00:00.301) 0:28:55.545 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 20:07:31 -0400 (0:00:00.259) 0:28:55.804 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 20:07:32 -0400 (0:00:00.380) 0:28:56.184 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:07:32 -0400 (0:00:00.390) 0:28:56.575 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:07:32 -0400 (0:00:00.212) 0:28:56.787 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:07:33 -0400 (0:00:00.845) 0:28:57.633 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 20:07:35 -0400 (0:00:01.732) 0:28:59.365 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 20:07:35 -0400 (0:00:00.420) 0:28:59.786 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 20:07:36 -0400 (0:00:00.383) 0:29:00.169 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 20:07:36 -0400 (0:00:00.329) 0:29:00.499 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 20:07:36 -0400 (0:00:00.471) 0:29:00.970 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 20:07:37 -0400 (0:00:00.356) 0:29:01.327 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 20:07:37 -0400 (0:00:00.346) 0:29:01.673 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:07:37 -0400 (0:00:00.316) 0:29:01.989 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:07:39 -0400 (0:00:01.117) 0:29:03.107 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:07:39 -0400 (0:00:00.520) 0:29:03.628 ******* skipping: [managed-node4] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 
2026 20:07:39 -0400 (0:00:00.340) 0:29:03.968 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:07:40 -0400 (0:00:00.294) 0:29:04.262 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:07:40 -0400 (0:00:00.430) 0:29:04.693 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:07:41 -0400 (0:00:00.485) 0:29:05.178 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:07:41 -0400 (0:00:00.302) 0:29:05.480 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:07:41 -0400 (0:00:00.322) 0:29:05.803 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:07:42 -0400 (0:00:00.356) 0:29:06.160 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:07:42 -0400 (0:00:00.696) 0:29:06.857 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:07:43 -0400 (0:00:00.528) 0:29:07.386 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 included: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:07:45 -0400 (0:00:02.308) 0:29:09.695 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:07:45 -0400 (0:00:00.251) 0:29:09.946 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:07:46 -0400 (0:00:00.438) 0:29:10.385 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:07:46 -0400 (0:00:00.407) 0:29:10.793 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:07:47 -0400 (0:00:00.352) 0:29:11.146 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:07:47 -0400 (0:00:00.422) 0:29:11.569 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:07:47 -0400 
(0:00:00.386) 0:29:11.955 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:07:48 -0400 (0:00:00.329) 0:29:12.285 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:07:48 -0400 (0:00:00.275) 0:29:12.560 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:07:48 -0400 (0:00:00.334) 0:29:12.895 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:07:49 -0400 (0:00:00.387) 0:29:13.282 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:07:49 -0400 (0:00:00.268) 0:29:13.550 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:07:50 -0400 (0:00:00.614) 0:29:14.164 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:07:50 -0400 (0:00:00.414) 0:29:14.579 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:07:50 -0400 (0:00:00.485) 0:29:15.064 ******* skipping: [managed-node4] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:07:51 -0400 (0:00:00.315) 0:29:15.380 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:07:51 -0400 (0:00:00.458) 0:29:15.838 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:07:52 -0400 (0:00:00.406) 0:29:16.245 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:07:52 -0400 (0:00:00.371) 0:29:16.617 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:07:53 -0400 (0:00:00.530) 0:29:17.147 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878808.4019518, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878808.4019518, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 190946, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878808.4019518, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:07:55 -0400 (0:00:02.002) 0:29:19.150 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:07:55 -0400 (0:00:00.527) 0:29:19.677 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make 
sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:07:55 -0400 (0:00:00.235) 0:29:19.913 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:07:56 -0400 (0:00:00.435) 0:29:20.349 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:07:56 -0400 (0:00:00.339) 0:29:20.688 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:07:56 -0400 (0:00:00.388) 0:29:21.077 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:07:57 -0400 (0:00:00.345) 0:29:21.422 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878808.5089521, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878808.5089521, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 203394, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773878808.5089521, "nlink": 1, "path": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:07:59 -0400 (0:00:01.986) 0:29:23.409 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:08:02 -0400 (0:00:03.398) 0:29:26.808 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.025334", "end": "2026-03-18 20:08:04.440384", "rc": 0, "start": "2026-03-18 20:08:04.415050" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 
Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 5e de 9a 3b 2d 19 ff 47 37 2b 07 4f 9f 5a 3e 26 16 be 53 7a MK salt: 85 82 ee 45 5f 47 5f cc 1e 12 ab 9d f8 d4 9f af ac 2b d2 7b 8a 26 88 20 13 7a 79 4a 72 d6 6d 08 MK iterations: 23108 UUID: 741e9deb-aed5-493b-a35c-627bfb56cbc5 Key Slot 0: ENABLED Iterations: 371308 Salt: 89 ce 53 01 f8 a8 34 93 6d fe 3d 4f 2f c0 a4 84 f9 b7 f7 85 c9 a1 7e 4d 5e 71 26 46 51 54 2b 87 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:08:04 -0400 (0:00:02.101) 0:29:28.909 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:08:05 -0400 (0:00:00.483) 0:29:29.393 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:08:05 -0400 (0:00:00.442) 0:29:29.835 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:08:06 -0400 (0:00:00.435) 0:29:30.271 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:08:06 -0400 (0:00:00.489) 0:29:30.761 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:08:07 -0400 (0:00:00.417) 0:29:31.178 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:08:07 -0400 (0:00:00.354) 0:29:31.533 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
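The three skipped checks above (LUKS version, key size, cipher) only run when the test pins those settings; this run left them null, so blivet took cryptsetup's defaults, which the dump shows as a LUKS1 header, a 512-bit master key, and aes in xts-plain64 mode. A sketch of how a volume definition could pin them explicitly, using the encryption_* options visible in the pool facts earlier (the values here are illustrative, chosen to match what this host produced, not taken from the test's vars):

    volumes:
      - name: test1
        size: 4g
        fs_type: xfs
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks1      # "Version: 1" in the dump above
        encryption_key_size: 512            # bits; "MK bits: 512" in the dump
        encryption_cipher: aes-xts-plain64  # "Cipher name: aes" / "Cipher mode: xts-plain64"

TASK [Set test variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:08:07 -0400 (0:00:00.289) 0:29:31.822 ******* ok: [managed-node4] => { "ansible_facts": {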
"_storage_test_crypttab_entries": [ "luks-741e9deb-aed5-493b-a35c-627bfb56cbc5 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:08:08 -0400 (0:00:00.391) 0:29:32.214 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:08:08 -0400 (0:00:00.365) 0:29:32.580 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:08:08 -0400 (0:00:00.440) 0:29:33.020 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:08:09 -0400 (0:00:00.416) 0:29:33.437 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:08:09 -0400 (0:00:00.352) 0:29:33.789 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:08:10 -0400 (0:00:00.357) 0:29:34.146 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:08:10 -0400 (0:00:00.371) 0:29:34.518 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:08:10 -0400 (0:00:00.262) 0:29:34.780 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:08:10 -0400 (0:00:00.277) 0:29:35.058 ******* skipping: [managed-node4] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:08:11 -0400 (0:00:00.359) 0:29:35.417 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:08:11 -0400 (0:00:00.308) 0:29:35.725 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:08:12 -0400 (0:00:00.404) 0:29:36.130 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:08:12 -0400 (0:00:00.311) 0:29:36.442 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:08:12 -0400 (0:00:00.356) 0:29:36.798 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:08:12 -0400 (0:00:00.283) 0:29:37.082 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:08:13 -0400 (0:00:00.239) 0:29:37.322 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:08:15 -0400 (0:00:01.928) 0:29:39.250 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:08:17 -0400 (0:00:01.988) 0:29:41.239 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: 
TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:08:17 -0400 (0:00:00.360) 0:29:41.599 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" }
TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:08:17 -0400 (0:00:00.222) 0:29:41.821 ******* ok: [managed-node4] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" }
TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:08:19 -0400 (0:00:02.184) 0:29:44.005 ******* skipping: [managed-node4] => {}
TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:08:20 -0400 (0:00:00.415) 0:29:44.421 ******* skipping: [managed-node4] => {}
TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:08:20 -0400 (0:00:00.311) 0:29:44.733 ******* skipping: [managed-node4] => {}
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:08:21 -0400 (0:00:00.372) 0:29:45.106 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:08:21 -0400 (0:00:00.346) 0:29:45.452 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:08:21 -0400 (0:00:00.434) 0:29:45.886 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:08:22 -0400 (0:00:00.388) 0:29:46.274 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:08:22 -0400 (0:00:00.400) 0:29:46.675 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Apply upper size limit to max usable thin pool space] ******************** task path:
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:08:22 -0400 (0:00:00.295) 0:29:46.970 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:08:23 -0400 (0:00:00.323) 0:29:47.294 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:08:23 -0400 (0:00:00.396) 0:29:47.690 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:08:23 -0400 (0:00:00.380) 0:29:48.071 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:08:24 -0400 (0:00:00.342) 0:29:48.413 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:08:24 -0400 (0:00:00.307) 0:29:48.720 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:08:24 -0400 (0:00:00.310) 0:29:49.031 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:08:25 -0400 (0:00:00.482) 0:29:49.514 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:08:25 -0400 (0:00:00.365) 0:29:49.879 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:08:26 -0400 (0:00:00.344) 0:29:50.224 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:08:26 -0400 (0:00:00.369) 0:29:50.593 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:08:26 -0400 (0:00:00.355) 0:29:50.949 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }
TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:08:27 -0400 (0:00:00.449) 0:29:51.399 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" }
TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:08:27 -0400 (0:00:00.315) 0:29:51.714 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:08:28 -0400 (0:00:00.391) 0:29:52.105 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.018730", "end": "2026-03-18 20:08:29.616758", "rc": 0, "start": "2026-03-18 20:08:29.598028" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:08:29 -0400 (0:00:01.901) 0:29:54.007 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }
TASK [Check segment type] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:08:30 -0400 (0:00:00.469) 0:29:54.477 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed
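The lvs flags above (--noheadings --nameprefixes --unquoted --nosuffix) make every field come back as a shell-style KEY=VALUE pair, which is what lets the test match the segment type directly against the raw stdout. A reduced sketch of the same probe; the task names are illustrative:

  - name: Query the LV segment type (illustrative sketch)
    command: lvs --noheadings --nameprefixes --unquoted -o name,segtype foo/test1
    register: lv_info
    changed_when: false  # reporting only

  - name: Confirm the LV is a plain linear segment, i.e. not cached (illustrative sketch)
    assert:
      that:
        - "'LVM2_SEGTYPE=linear' in lv_info.stdout"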
"changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:08:31 -0400 (0:00:00.376) 0:29:55.748 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:08:32 -0400 (0:00:00.386) 0:29:56.134 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:08:32 -0400 (0:00:00.305) 0:29:56.439 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:08:32 -0400 (0:00:00.358) 0:29:56.798 ******* TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:08:32 -0400 (0:00:00.233) 0:29:57.031 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:502 Wednesday 18 March 2026 20:08:33 -0400 (0:00:00.493) 0:29:57.525 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:08:34 -0400 (0:00:00.839) 0:29:58.364 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:08:34 -0400 (0:00:00.354) 0:29:58.719 ******* TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:08:34 -0400 (0:00:00.302) 0:29:59.022 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:08:35 -0400 (0:00:00.483) 0:29:59.505 ******* ok: [managed-node4]
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:08:38 -0400 (0:00:03.112) 0:30:02.617 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-loop", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node4] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }
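Worth noting in the CentOS_7.yml result above: blivet_package_list embeds an inline Jinja conditional, so a single vars file serves both s390x (which needs libblockdev-s390) and every other architecture. A trimmed sketch of that pattern; the list is shortened, and only the conditional entry mirrors the role's actual vars file.

  # vars-file fragment; the template is evaluated per host when the vars load
  blivet_package_list:
    - python-blivet3
    - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"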
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:08:39 -0400 (0:00:00.852) 0:30:03.470 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:08:39 -0400 (0:00:00.324) 0:30:03.795 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:08:39 -0400 (0:00:00.296) 0:30:04.091 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:08:40 -0400 (0:00:00.250) 0:30:04.342 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:08:40 -0400 (0:00:00.279) 0:30:04.621 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:08:41 -0400 (0:00:00.800) 0:30:05.422 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-loop-2.18-5.el7.x86_64 providing libblockdev-loop is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:08:45 -0400 (0:00:04.183) 0:30:09.605 ******* ok: [managed-node4] => { "storage_pools | d([])": [] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:08:45 -0400 (0:00:00.383) 0:30:09.989 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:08:46 -0400 (0:00:00.441) 0:30:10.430 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:08:52 -0400 (0:00:05.866) 0:30:16.296 ******* included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:08:52 -0400 (0:00:00.484) 0:30:16.781 *******
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:08:52 -0400 (0:00:00.279) 0:30:17.061 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:08:53 -0400 (0:00:00.337) 0:30:17.398 *******
TASK
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:08:53 -0400 (0:00:00.297) 0:30:17.696 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:08:57 -0400 (0:00:03.773) 0:30:21.469 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, 
"systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:09:00 
-0400 (0:00:03.168) 0:30:24.638 *******
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:09:01 -0400 (0:00:00.476) 0:30:25.114 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }
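The action list above shows blivet unwinding the stack strictly top-down: the xfs format, then the LUKS device and its format, then the LV, the VG, and finally the lvmpv format on sda. The whole teardown is driven by the declarative request echoed earlier under "Show storage_volumes"; a minimal sketch of invoking the role that way follows (play scaffolding assumed; the role name and volume spec are taken from the log):

  - hosts: managed-node4
    tasks:
      - name: Remove the test volume and everything stacked on it (illustrative sketch)
        include_role:
          name: fedora.linux_system_roles.storage
        vars:
          storage_volumes:
            - name: foo
              type: disk
              disks: [sda]
              state: absent  # absent tears down the device stack plus fstab/crypttab entries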
false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773878817.223977, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071680134064", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:09:40 -0400 (0:00:02.328) 0:31:05.051 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:09:42 -0400 (0:00:01.860) 0:31:06.911 ******* TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:09:43 -0400 (0:00:00.499) 0:31:07.411 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130
Wednesday 18 March 2026 20:09:43 -0400 (0:00:00.501) 0:31:07.913 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134
Wednesday 18 March 2026 20:09:44 -0400 (0:00:00.220) 0:31:08.133 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Wednesday 18 March 2026 20:09:44 -0400 (0:00:00.435) 0:31:08.569 *******
changed: [managed-node4] => (item={u'src': u'/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5" }
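Each entry of the module's "mounts" list is looped into Ansible's mount module, which both unmounts the path and removes the /etc/fstab line. A standalone equivalent of the removal above would look roughly like this (a sketch, not the role's actual task):

    - name: Remove the obsolete /etc/fstab entry
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-741e9deb-aed5-493b-a35c-627bfb56cbc5
        fstype: xfs
        state: absent

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Wednesday 18 March 2026 20:09:46 -0400 (0:00:02.228) 0:31:10.798 *******
ok: [managed-node4] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Wednesday 18 March 2026 20:09:49 -0400 (0:00:02.369) 0:31:13.167 *******

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Wednesday 18 March 2026 20:09:49 -0400 (0:00:00.419) 0:31:13.587 *******

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: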
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Wednesday 18 March 2026 20:09:49 -0400 (0:00:00.406) 0:31:13.993 *******
ok: [managed-node4] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Wednesday 18 March 2026 20:09:52 -0400 (0:00:02.193) 0:31:16.186 *******
ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878829.1910112, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "64b914b552862ae39f29bf01b7e2e3a0f2502d12", "ctime": 1773878821.6139896, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 264034, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773878821.6129894, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071680136870", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Wednesday 18 March 2026 20:09:53 -0400 (0:00:01.811) 0:31:17.998 *******
changed: [managed-node4] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-741e9deb-aed5-493b-a35c-627bfb56cbc5', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-741e9deb-aed5-493b-a35c-627bfb56cbc5", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed
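Here each item of the module's "crypts" list is applied with the crypttab module; "found": 1 confirms exactly one matching line existed and was removed. A standalone sketch of the same edit:

    - name: Drop the stale /etc/crypttab entry
      crypttab:
        name: luks-741e9deb-aed5-493b-a35c-627bfb56cbc5
        state: absent

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Wednesday 18 March 2026 20:09:55 -0400 (0:00:02.045) 0:31:20.043 *******
ok: [managed-node4]

TASK [Verify role results - 11] ************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:511
Wednesday 18 March 2026 20:09:58 -0400 (0:00:02.052) 0:31:22.096 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4

TASK [Print out pool information] **********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Wednesday 18 March 2026 20:09:58 -0400 (0:00:00.746) 0:31:22.843 *******
skipping: [managed-node4] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Wednesday 18 March 2026 20:09:59 -0400 (0:00:00.301) 0:31:23.144 *******
ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=qrc5qj-1or5-aUH1-hq0R-233J-AQ7d-XEcSEK", "_raw_device":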
"/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:09:59 -0400 (0:00:00.238) 0:31:23.383 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:10:00 -0400 (0:00:01.279) 0:31:24.662 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002914", "end": "2026-03-18 20:10:01.965819", "rc": 0, "start": "2026-03-18 20:10:01.962905" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs 
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Wednesday 18 March 2026 20:10:02 -0400 (0:00:01.714) 0:31:26.377 *******
ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003013", "end": "2026-03-18 20:10:03.860850", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:10:03.857837" }
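The missing STDOUT block here is the point of the check: after the removal, /etc/crypttab is empty. Note "failed_when_result": false; the verification playbook reads the file with its failure condition disarmed so a missing file cannot abort the run. A sketch of that read-then-assert pattern (the register name is illustrative):

    - name: Read /etc/crypttab
      command: cat /etc/crypttab
      register: storage_test_crypttab
      failed_when: false
      changed_when: false

    - name: Expect no remaining crypttab entries
      assert:
        that:
          - storage_test_crypttab.stdout_lines | length == 0

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Wednesday 18 March 2026 20:10:04 -0400 (0:00:01.978) 0:31:28.355 *******

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Wednesday 18 March 2026 20:10:04 -0400 (0:00:00.239) 0:31:28.594 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Wednesday 18 March 2026 20:10:05 -0400 (0:00:00.540) 0:31:29.135 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Wednesday 18 March 2026 20:10:05 -0400 (0:00:00.528) 0:31:29.664 *******
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4
included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4
included: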
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 included: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:10:07 -0400 (0:00:01.842) 0:31:31.506 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:10:07 -0400 (0:00:00.327) 0:31:31.834 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:10:08 -0400 (0:00:00.534) 0:31:32.368 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:10:08 -0400 (0:00:00.366) 0:31:32.735 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:10:08 -0400 (0:00:00.258) 0:31:32.993 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:10:09 -0400 (0:00:00.246) 0:31:33.239 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:10:09 -0400 (0:00:00.256) 0:31:33.496 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:10:09 -0400 (0:00:00.256) 0:31:33.753 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 
2026 20:10:09 -0400 (0:00:00.307) 0:31:34.060 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:10:10 -0400 (0:00:00.273) 0:31:34.334 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:10:10 -0400 (0:00:00.316) 0:31:34.651 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:10:10 -0400 (0:00:00.262) 0:31:34.913 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:10:11 -0400 (0:00:00.726) 0:31:35.640 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:10:11 -0400 (0:00:00.299) 0:31:35.939 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:10:12 -0400 (0:00:00.346) 0:31:36.286 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:10:12 -0400 (0:00:00.227) 0:31:36.513 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:10:12 -0400 (0:00:00.217) 0:31:36.731 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, 
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:10:12 -0400 (0:00:00.147) 0:31:36.878 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:10:13 -0400 (0:00:00.288) 0:31:37.166 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:10:13 -0400 (0:00:00.340) 0:31:37.507 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878977.5804348, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1773878977.5804348, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28971, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773878977.5804348, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:10:15 -0400 (0:00:01.949) 0:31:39.457 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:10:15 -0400 (0:00:00.620) 0:31:40.078 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:10:16 -0400 (0:00:00.369) 0:31:40.447 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:10:16 -0400 (0:00:00.242) 0:31:40.690 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": 
false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Wednesday 18 March 2026 20:10:16 -0400 (0:00:00.281) 0:31:40.972 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Wednesday 18 March 2026 20:10:17 -0400 (0:00:00.291) 0:31:41.263 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Wednesday 18 March 2026 20:10:17 -0400 (0:00:00.228) 0:31:41.492 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Wednesday 18 March 2026 20:10:17 -0400 (0:00:00.459) 0:31:41.951 *******
ok: [managed-node4] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
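The encryption checks need the cryptsetup tooling even when they only inspect headers, so the test installs it up front; on this EL7 node the package was already present. The guard amounts to a plain package task, roughly:

    - name: Ensure cryptsetup is present
      package:
        name: cryptsetup
        state: present

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Wednesday 18 March 2026 20:10:21 -0400 (0:00:03.694) 0:31:45.646 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Wednesday 18 March 2026 20:10:21 -0400 (0:00:00.298) 0:31:45.944 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Wednesday 18 March 2026 20:10:22 -0400 (0:00:00.365) 0:31:46.310 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Wednesday 18 March 2026 20:10:22 -0400 (0:00:00.272) 0:31:46.583 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Wednesday 18 March 2026 20:10:22 -0400 (0:00:00.257) 0:31:46.840 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************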
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Wednesday 18 March 2026 20:10:23 -0400 (0:00:00.389) 0:31:47.230 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Wednesday 18 March 2026 20:10:23 -0400 (0:00:00.217) 0:31:47.447 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Wednesday 18 March 2026 20:10:23 -0400 (0:00:00.267) 0:31:47.715 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Wednesday 18 March 2026 20:10:23 -0400 (0:00:00.239) 0:31:47.955 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Wednesday 18 March 2026 20:10:24 -0400 (0:00:00.478) 0:31:48.433 *******
ok: [managed-node4] => { "changed": false }
MSG: All assertions passed
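With the volume gone, _storage_test_expected_crypttab_entries is "0" and _storage_test_crypttab_entries is empty, so the check reduces to comparing a filtered line count against the expectation. A sketch of the comparison the test performs with those facts:

    - name: Check for /etc/crypttab entry
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Wednesday 18 March 2026 20:10:24 -0400 (0:00:00.472) 0:31:48.906 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Wednesday 18 March 2026 20:10:25 -0400 (0:00:00.438) 0:31:49.345 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Wednesday 18 March 2026 20:10:25 -0400 (0:00:00.301) 0:31:49.647 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Wednesday 18 March 2026 20:10:25 -0400 (0:00:00.333) 0:31:49.980 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about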
RAID] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:10:26 -0400 (0:00:00.369) 0:31:50.350 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:10:26 -0400 (0:00:00.281) 0:31:50.632 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:10:26 -0400 (0:00:00.350) 0:31:50.982 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:10:27 -0400 (0:00:00.304) 0:31:51.287 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:10:27 -0400 (0:00:00.419) 0:31:51.707 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:10:27 -0400 (0:00:00.322) 0:31:52.030 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:10:28 -0400 (0:00:00.350) 0:31:52.380 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:10:28 -0400 (0:00:00.299) 0:31:52.679 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:10:28 -0400 (0:00:00.310) 0:31:52.990 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:10:29 
-0400 (0:00:00.343) 0:31:53.334 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:10:29 -0400 (0:00:00.323) 0:31:53.657 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:10:29 -0400 (0:00:00.253) 0:31:53.911 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:10:30 -0400 (0:00:00.382) 0:31:54.293 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:10:30 -0400 (0:00:00.232) 0:31:54.525 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:10:30 -0400 (0:00:00.266) 0:31:54.792 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:10:30 -0400 (0:00:00.293) 0:31:55.086 ******* skipping: [managed-node4] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:10:31 -0400 (0:00:00.285) 0:31:55.371 ******* skipping: [managed-node4] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:10:31 -0400 (0:00:00.256) 0:31:55.628 ******* skipping: [managed-node4] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:10:31 -0400 (0:00:00.301) 0:31:55.930 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:10:32 -0400 (0:00:00.315) 0:31:56.245 
******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:10:32 -0400 (0:00:00.151) 0:31:56.397 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:10:32 -0400 (0:00:00.239) 0:31:56.637 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:10:32 -0400 (0:00:00.348) 0:31:56.985 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:10:33 -0400 (0:00:00.338) 0:31:57.324 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:10:33 -0400 (0:00:00.289) 0:31:57.613 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:10:33 -0400 (0:00:00.303) 0:31:57.917 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:10:34 -0400 (0:00:00.241) 0:31:58.159 ******* skipping: [managed-node4] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:10:34 -0400 (0:00:00.308) 0:31:58.468 ******* skipping: [managed-node4] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:10:34 -0400 (0:00:00.391) 0:31:58.859 ******* skipping: [managed-node4] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:10:35 -0400 (0:00:00.302) 0:31:59.161 
******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:10:35 -0400 (0:00:00.214) 0:31:59.376 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:10:35 -0400 (0:00:00.277) 0:31:59.654 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:10:35 -0400 (0:00:00.214) 0:31:59.868 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:10:36 -0400 (0:00:00.229) 0:32:00.098 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:10:36 -0400 (0:00:00.211) 0:32:00.309 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:10:36 -0400 (0:00:00.202) 0:32:00.512 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:10:36 -0400 (0:00:00.197) 0:32:00.709 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:10:36 -0400 (0:00:00.155) 0:32:00.865 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:10:37 -0400 (0:00:00.272) 0:32:01.137 ******* skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] 
******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Wednesday 18 March 2026 20:10:37 -0400 (0:00:00.252) 0:32:01.390 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Wednesday 18 March 2026 20:10:37 -0400 (0:00:00.270) 0:32:01.661 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Wednesday 18 March 2026 20:10:37 -0400 (0:00:00.209) 0:32:01.871 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Wednesday 18 March 2026 20:10:38 -0400 (0:00:00.322) 0:32:02.193 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Wednesday 18 March 2026 20:10:38 -0400 (0:00:00.204) 0:32:02.398 *******
skipping: [managed-node4] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Wednesday 18 March 2026 20:10:38 -0400 (0:00:00.239) 0:32:02.637 *******
ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Wednesday 18 March 2026 20:10:38 -0400 (0:00:00.260) 0:32:02.897 *******
ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
managed-node4 : ok=1245 changed=60 unreachable=0 failed=9 skipped=1073 rescued=9 ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[ { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:40:11.792246Z", "host": "managed-node4", "message": "encrypted volume 'foo' missing key/password", "start_time": "2026-03-18T23:40:05.960880Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" },
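With rescued=9 matching failed=9 in the recap, every failure in this run was one the test provoked deliberately and then caught in a rescue block. The first recorded error, "encrypted volume 'foo' missing key/password", is what the role returns when a volume requests encryption: true but supplies neither encryption_password nor encryption_key, as the module_args dump that follows shows. A corrected input would carry a key, along these lines (the vault variable name is a placeholder, not from this run):

    storage_volumes:
      - name: foo
        type: disk
        disks: [sda]
        mount_point: /opt/test1
        encryption: true
        encryption_password: "{{ vault_luks_password }}"  # placeholder secret

{ "ansible_version": "2.9.27", "end_time": "2026-03-18T23:40:12.069265Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null,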
"diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-18T23:40:11.845455Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:42:46.012901Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'luks-5dfb6f1c-458e-4933-922c-32b2268852fe' in safe mode due to encryption removal", "start_time": "2026-03-18T23:42:40.277090Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:42:46.258877Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-5dfb6f1c-458e-4933-922c-32b2268852fe' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-18T23:42:46.033388Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:45:01.039229Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-03-18T23:44:55.335878Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:45:01.577379Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-18T23:45:01.117571Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:47:26.089579Z", "host": "managed-node4", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-03-18T23:47:20.707223Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:47:26.470706Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-18T23:47:26.132578Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:50:23.307945Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'luks-899c6988-aa8f-4f93-9423-a0a1748937ea' in safe mode due to encryption removal", "start_time": "2026-03-18T23:50:17.323101Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:50:23.855346Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { 
"disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-899c6988-aa8f-4f93-9423-a0a1748937ea' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-18T23:50:23.409230Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:53:10.760002Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding 
encryption", "start_time": "2026-03-18T23:53:04.848677Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:53:11.355737Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding 
encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-18T23:53:10.844848Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:56:16.239033Z", "host": "managed-node4", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-03-18T23:56:09.881270Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-18T23:56:16.751844Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-18T23:56:16.295932Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-19T00:02:02.406323Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'luks-64ca6b26-d808-4080-846a-f5a544a68787' in safe mode due to encryption removal", "start_time": "2026-03-19T00:01:56.441613Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-19T00:02:02.991088Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-64ca6b26-d808-4080-846a-f5a544a68787' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:02:02.484384Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-03-19T00:06:02.160586Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-03-19T00:05:56.362202Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-03-19T00:06:02.649915Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:06:02.235592Z", "task_name": "Failed message", "task_path": "/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Wednesday 18 March 2026 20:10:39 -0400 (0:00:00.298) 0:32:03.196 ******* =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 66.56s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 37.07s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Include the appropriate provider tasks -- 17.22s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.98s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.89s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.74s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.10s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.04s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.99s 
/tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Set platform/version specific variables --- 8.16s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing --- 6.58s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.47s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.41s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Gathering Facts --------------------------------------------------------- 6.33s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2 fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 6.29s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing --- 6.22s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 fedora.linux_system_roles.storage : Get required packages --------------- 6.21s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 6.09s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.09s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.04s /tmp/collections-8tn/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
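
The recurring "cannot remove existing formatting ... in safe mode" failures recorded above are the storage role's safe mode (storage_safe_mode, which defaults to true) refusing to destroy existing formatting while encryption is being added or removed; tests_luks.yml provokes these deliberately and then asserts on the failure message. As a hedged sketch only, not the test's actual invocation, a caller who genuinely intends the destructive re-format would disable safe mode and pass the full volume spec, roughly:

    - hosts: managed-node4                # hypothetical target; any inventory host works
      vars:
        storage_safe_mode: false          # permit destroying the existing LUKS/xfs formatting
        storage_volumes:
          - name: foo                     # volume name as it appears in the log above
            type: disk
            disks: [sda]
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_password: "{{ luks_passphrase }}"   # hypothetical vaulted variable, not the test's real value
      roles:
        - fedora.linux_system_roles.storage

With storage_safe_mode left at its default of true, the same spec reproduces the "in safe mode due to adding encryption" error logged at main-blivet.yml:70.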
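
The other repeated failure, "encrypted volume 'test1' missing key/password", is the module's input validation: encryption: true was requested for the pool volume while both encryption_password and encryption_key were null. A minimal sketch of a spec that passes that check, again assuming a hypothetical passphrase variable rather than the test's real one:

    - hosts: managed-node4
      vars:
        storage_pools:
          - name: foo
            type: lvm                     # the log shows both 'partition' and 'lvm' variants of this pool
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_password: "{{ luks_passphrase }}"   # required once encryption is true
      roles:
        - fedora.linux_system_roles.storage

Either encryption_password or encryption_key satisfies the validation; omitting both yields the error seen above at the validation stage, before any device-level action is attempted.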