ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
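The statically imported create-test-file.yml and verify-data-preservation.yml task files referenced above implement the data-preservation check that the LUKS test runs between role invocations. The actual task files are not shown in this log; as a purely illustrative sketch of that pattern (file paths, variable names, and module arguments here are assumptions, not the real test content):

  # create-test-file.yml -- illustrative sketch only, not the actual file
  - name: Create a marker file on the test mount point
    file:
      path: "{{ testfile | default('/opt/test1/quux') }}"   # hypothetical variable and path
      state: touch

  # verify-data-preservation.yml -- illustrative sketch only, not the actual file
  - name: Stat the marker file
    stat:
      path: "{{ testfile | default('/opt/test1/quux') }}"
    register: __marker_stat

  - name: Assert that data on the volume was preserved
    assert:
      that: __marker_stat.stat.exists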
PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Monday 20 April 2026 15:44:56 -0400 (0:00:00.302) 0:00:00.302 **********
ok: [managed-node3]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Monday 20 April 2026 15:45:00 -0400 (0:00:04.193) 0:00:04.495 **********
skipping: [managed-node3] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot] ******************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Monday 20 April 2026 15:45:00 -0400 (0:00:00.373) 0:00:04.869 **********
skipping: [managed-node3] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Monday 20 April 2026 15:45:01 -0400 (0:00:00.259) 0:00:05.128 **********
skipping: [managed-node3] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Monday 20 April 2026 15:45:01 -0400 (0:00:00.621) 0:00:05.750 **********
skipping: [managed-node3] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Monday 20 April 2026 15:45:02 -0400 (0:00:00.428) 0:00:06.178 **********
skipping: [managed-node3] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Monday 20 April 2026 15:45:02 -0400 (0:00:00.415) 0:00:06.594 **********
skipping: [managed-node3] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Monday 20 April 2026 15:45:03 -0400 (0:00:00.423) 0:00:07.017 **********
skipping: [managed-node3] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [Run the role] ************************************************************
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Monday 20 April 2026 15:45:03 -0400 (0:00:00.507) 0:00:07.524 **********
included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3
META: facts cleared

TASK [Run the role]
************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:45:03 -0400 (0:00:00.256) 0:00:07.780 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:45:04 -0400 (0:00:00.386) 0:00:08.167 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:45:05 -0400 (0:00:00.885) 0:00:09.053 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:45:05 -0400 (0:00:00.213) 0:00:09.267 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:45:07 -0400 (0:00:02.193) 0:00:11.461 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:45:09 -0400 (0:00:01.936) 0:00:13.397 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:45:10 -0400 (0:00:00.607) 0:00:14.005 ********** ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:45:13 -0400 (0:00:03.006) 0:00:17.012 ********** ok: [managed-node3] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:45:13 -0400 (0:00:00.236) 0:00:17.248 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:45:13 -0400 (0:00:00.178) 0:00:17.426 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:45:13 -0400 (0:00:00.200) 0:00:17.627 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:45:14 -0400 (0:00:00.808) 0:00:18.436 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:45:14 -0400 (0:00:00.304) 0:00:18.740 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:45:15 -0400 (0:00:00.267) 0:00:19.008 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:45:20 -0400 (0:00:05.559) 0:00:24.567 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:45:20 -0400 (0:00:00.222) 0:00:24.790 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : 
Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:45:21 -0400 (0:00:00.160) 0:00:24.951 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:45:24 -0400 (0:00:03.245) 0:00:28.196 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:45:24 -0400 (0:00:00.292) 0:00:28.489 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:45:24 -0400 (0:00:00.233) 0:00:28.722 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:45:24 -0400 (0:00:00.176) 0:00:28.898 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:45:25 -0400 (0:00:00.258) 0:00:29.157 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:45:29 -0400 (0:00:04.356) 0:00:33.513 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:45:34 -0400 (0:00:04.534) 0:00:38.048 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:45:34 -0400 (0:00:00.529) 0:00:38.577 ********** ok: [managed-node3] => { 
"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:45:36 -0400 (0:00:01.645) 0:00:40.223 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:45:36 -0400 (0:00:00.239) 0:00:40.463 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776713920.2462032, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776713918.6812062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776713918.6812062, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:45:37 -0400 (0:00:01.413) 0:00:41.876 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:45:38 -0400 (0:00:00.163) 0:00:42.039 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:45:38 -0400 (0:00:00.245) 0:00:42.285 ********** ok: [managed-node3] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:45:38 -0400 (0:00:00.233) 0:00:42.518 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:45:38 -0400 (0:00:00.155) 0:00:42.674 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete 
mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:45:38 -0400 (0:00:00.162) 0:00:42.837 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:45:39 -0400 (0:00:00.207) 0:00:43.044 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:45:39 -0400 (0:00:00.097) 0:00:43.142 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:45:39 -0400 (0:00:00.075) 0:00:43.217 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:45:39 -0400 (0:00:00.128) 0:00:43.345 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:45:39 -0400 (0:00:00.086) 0:00:43.432 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776713022.0129735, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:45:40 -0400 (0:00:00.904) 0:00:44.337 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:45:40 -0400 (0:00:00.126) 0:00:44.463 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:45:41 -0400 (0:00:01.447) 0:00:45.910 ********** ok: [managed-node3] => { "changed": false } 
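Up to this point the storage role has been invoked with nothing to manage (storage_pools and storage_volumes both empty, as the "Show storage_pools" / "Show storage_volumes" tasks above confirm), so the blivet provider reports no actions and the run is a baseline no-op. A minimal sketch of that kind of invocation, assuming the role is pulled in with include_role (the actual wrapper, run_role_with_clear_facts.yml, is not shown in this log and may differ):

  - name: Run the storage role with nothing to manage   # illustrative sketch
    include_role:
      name: fedora.linux_system_roles.storage
    vars:
      storage_pools: []
      storage_volumes: []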
TASK [Get unused disks] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:75 Monday 20 April 2026 15:45:43 -0400 (0:00:01.881) 0:00:47.792 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node3 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Monday 20 April 2026 15:45:44 -0400 (0:00:00.381) 0:00:48.174 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Monday 20 April 2026 15:45:48 -0400 (0:00:04.031) 0:00:52.205 ********** ok: [managed-node3] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Monday 20 April 2026 15:45:52 -0400 (0:00:04.344) 0:00:56.550 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Monday 20 April 2026 15:45:52 -0400 (0:00:00.270) 0:00:56.821 ********** ok: [managed-node3] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Monday 20 April 2026 15:45:53 -0400 (0:00:00.339) 0:00:57.160 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Monday 20 April 2026 15:45:53 -0400 (0:00:00.217) 0:00:57.377 ********** ok: [managed-node3] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:84 Monday 20 April 2026 15:45:53 -0400 (0:00:00.194) 0:00:57.571 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:45:53 -0400 (0:00:00.203) 0:00:57.775 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:45:54 -0400 (0:00:00.175) 0:00:57.950 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:45:54 -0400 (0:00:00.339) 0:00:58.290 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:45:54 -0400 (0:00:00.191) 0:00:58.481 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:45:54 -0400 (0:00:00.251) 0:00:58.733 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:45:55 -0400 (0:00:00.232) 0:00:58.966 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:45:57 -0400 (0:00:02.498) 0:01:01.464 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:45:58 -0400 (0:00:01.269) 0:01:02.734 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", 
"skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:45:59 -0400 (0:00:00.364) 0:01:03.098 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:45:59 -0400 (0:00:00.179) 0:01:03.278 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:45:59 -0400 (0:00:00.204) 0:01:03.482 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:45:59 -0400 (0:00:00.238) 0:01:03.721 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:45:59 -0400 (0:00:00.125) 0:01:03.846 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:46:00 -0400 (0:00:00.431) 0:01:04.278 ********** skipping: [managed-node3] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:46:00 -0400 (0:00:00.249) 0:01:04.528 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:46:00 -0400 (0:00:00.125) 0:01:04.653 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:46:04 -0400 (0:00:04.223) 0:01:08.877 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:46:05 -0400 (0:00:00.296) 0:01:09.174 ********** ok: [managed-node3] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:46:05 -0400 (0:00:00.276) 0:01:09.450 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:46:10 -0400 (0:00:04.855) 0:01:14.305 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:46:10 -0400 (0:00:00.281) 0:01:14.587 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:46:10 -0400 (0:00:00.130) 0:01:14.717 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:46:10 -0400 (0:00:00.177) 0:01:14.895 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 
15:46:11 -0400 (0:00:00.158) 0:01:15.054 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:46:15 -0400 (0:00:04.183) 0:01:19.237 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": 
"inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": 
"ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:46:18 -0400 (0:00:02.837) 0:01:22.075 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:46:18 -0400 (0:00:00.419) 0:01:22.494 ********** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:46:23 -0400 (0:00:05.449) 0:01:27.943 ********** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:46:24 -0400 (0:00:00.325) 0:01:28.268 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:46:24 -0400 (0:00:00.360) 0:01:28.629 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:46:24 -0400 (0:00:00.157) 0:01:28.786 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify 
correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:46:25 -0400 (0:00:00.307) 0:01:29.094 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Monday 20 April 2026 15:46:25 -0400 (0:00:00.272) 0:01:29.366 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:46:25 -0400 (0:00:00.311) 0:01:29.678 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:46:25 -0400 (0:00:00.206) 0:01:29.884 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:46:26 -0400 (0:00:00.216) 0:01:30.101 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:46:26 -0400 (0:00:00.128) 0:01:30.230 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:46:28 -0400 (0:00:02.272) 0:01:32.503 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:46:29 -0400 (0:00:01.372) 0:01:33.875 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" 
], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:46:30 -0400 (0:00:00.449) 0:01:34.324 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:46:30 -0400 (0:00:00.262) 0:01:34.587 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:46:30 -0400 (0:00:00.157) 0:01:34.745 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:46:30 -0400 (0:00:00.138) 0:01:34.883 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:46:31 -0400 (0:00:00.157) 0:01:35.040 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:46:31 -0400 (0:00:00.386) 0:01:35.427 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:46:31 -0400 (0:00:00.177) 0:01:35.604 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:46:31 -0400 (0:00:00.281) 0:01:35.886 ********** ok: 
[managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:46:36 -0400 (0:00:04.150) 0:01:40.036 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:46:36 -0400 (0:00:00.177) 0:01:40.214 ********** ok: [managed-node3] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:46:36 -0400 (0:00:00.189) 0:01:40.404 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:46:41 -0400 (0:00:05.120) 0:01:45.524 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:46:41 -0400 (0:00:00.283) 0:01:45.807 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:46:42 -0400 (0:00:00.282) 0:01:46.090 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:46:42 -0400 (0:00:00.266) 0:01:46.356 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:46:42 -0400 (0:00:00.229) 0:01:46.586 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:46:46 -0400 (0:00:04.295) 0:01:50.881 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:46:49 -0400 (0:00:02.897) 0:01:53.779 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:46:50 -0400 (0:00:00.273) 0:01:54.053 ********** changed: [managed-node3] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:47:04 -0400 (0:00:13.937) 0:02:07.990 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:47:04 -0400 (0:00:00.302) 0:02:08.292 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776713920.2462032, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1776713918.6812062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776713918.6812062, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:47:05 -0400 (0:00:01.570) 0:02:09.863 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:47:08 -0400 (0:00:03.048) 0:02:12.912 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:47:09 -0400 (0:00:00.410) 0:02:13.322 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:47:09 -0400 (0:00:00.250) 0:02:13.572 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:47:09 -0400 (0:00:00.320) 0:02:13.893 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:47:10 -0400 (0:00:00.365) 0:02:14.259 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:47:10 -0400 (0:00:00.167) 0:02:14.426 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:47:15 -0400 (0:00:05.008) 0:02:19.435 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 
'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:47:19 -0400 (0:00:03.578) 0:02:23.013 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:47:19 -0400 (0:00:00.380) 0:02:23.394 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:47:21 -0400 (0:00:01.718) 0:02:25.112 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776713022.0129735, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:47:22 -0400 (0:00:01.573) 0:02:26.686 ********** changed: [managed-node3] => (item={'backing_device': '/dev/sda', 'name': 'luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "password": "-", "state": "present" } } MSG: line added TASK 
[fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:47:24 -0400 (0:00:01.515) 0:02:28.201 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:47:26 -0400 (0:00:02.172) 0:02:30.374 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:110 Monday 20 April 2026 15:47:27 -0400 (0:00:01.514) 0:02:31.889 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:47:28 -0400 (0:00:00.355) 0:02:32.245 ********** skipping: [managed-node3] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:47:28 -0400 (0:00:00.189) 0:02:32.434 ********** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:47:28 -0400 (0:00:00.211) 0:02:32.645 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "size": "10G", "type": "crypt", "uuid": "405b3eda-7c1d-4288-b2a5-c8c9aa1ed386" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "95fcf1a3-b37a-4b7a-9bf2-8d197d787770" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:47:31 -0400 (0:00:02.816) 0:02:35.462 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002536", "end": "2026-04-20 15:47:34.211107", "rc": 0, "start": "2026-04-20 15:47:34.208571" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:47:34 -0400 (0:00:02.941) 0:02:38.404 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002541", "end": "2026-04-20 15:47:35.893260", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:47:35.890719" } STDOUT: luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:47:36 -0400 (0:00:01.716) 0:02:40.120 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:47:36 -0400 (0:00:00.230) 0:02:40.350 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:47:36 -0400 (0:00:00.393) 0:02:40.744 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:47:37 -0400 (0:00:00.231) 0:02:40.975 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for 
managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:47:38 -0400 (0:00:01.281) 0:02:42.257 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:47:39 -0400 (0:00:00.739) 0:02:42.997 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:47:39 -0400 (0:00:00.243) 0:02:43.240 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:47:39 -0400 (0:00:00.343) 0:02:43.584 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:47:39 -0400 (0:00:00.277) 0:02:43.862 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:47:40 -0400 (0:00:00.226) 0:02:44.089 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:47:40 -0400 (0:00:00.349) 0:02:44.439 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:47:40 -0400 (0:00:00.231) 0:02:44.670 
********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:47:41 -0400 (0:00:00.322) 0:02:44.992 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:47:41 -0400 (0:00:00.206) 0:02:45.199 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:47:41 -0400 (0:00:00.219) 0:02:45.418 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:47:41 -0400 (0:00:00.256) 0:02:45.675 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:47:42 -0400 (0:00:00.435) 0:02:46.111 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:47:42 -0400 (0:00:00.303) 0:02:46.415 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:47:42 -0400 (0:00:00.214) 0:02:46.629 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:47:42 -0400 (0:00:00.187) 0:02:46.816 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed 
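For context, the encrypted volume exercised by these verification tasks corresponds to a storage role invocation along the following lines. This is a minimal sketch reconstructed from the volume facts printed above, not the literal task from tests_luks.yml; the passphrase variable and the human-readable size spelling are placeholders.

- name: Create an encrypted disk volume mounted at /opt/test1
  include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_volumes:
      - name: foo                  # volume name reported in the role output above
        type: disk
        disks:
          - sda                    # backing disk, formatted as crypto_LUKS
        fs_type: xfs
        mount_point: /opt/test1
        encryption: true
        encryption_password: "{{ luks_passphrase }}"   # hypothetical variable; the test supplies it with no_log
        size: 10g                  # the role output reports 10737418240 bytes

With these variables the role creates the LUKS2 container on /dev/sda, opens it as /dev/mapper/luks-<UUID>, builds the xfs filesystem on the mapper device, mounts it at /opt/test1, and records the matching /etc/fstab and /etc/crypttab entries, which is what the fstab assertions above and the crypttab assertions below confirm.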
TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:47:43 -0400 (0:00:00.242) 0:02:47.058 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:47:43 -0400 (0:00:00.150) 0:02:47.209 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:47:43 -0400 (0:00:00.301) 0:02:47.510 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:47:43 -0400 (0:00:00.240) 0:02:47.751 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714423.6382053, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714423.6382053, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 38585, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776714423.6382053, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:47:45 -0400 (0:00:01.428) 0:02:49.179 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:47:45 -0400 (0:00:00.325) 0:02:49.505 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:47:45 -0400 (0:00:00.251) 0:02:49.756 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:47:46 -0400 (0:00:00.251) 0:02:50.008 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:47:46 -0400 (0:00:00.280) 0:02:50.289 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:47:46 -0400 (0:00:00.267) 0:02:50.556 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:47:46 -0400 (0:00:00.366) 0:02:50.922 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714423.7612052, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714423.7612052, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 187692, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776714423.7612052, "nlink": 1, "path": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:47:48 -0400 (0:00:01.473) 0:02:52.396 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:47:52 -0400 (0:00:04.494) 0:02:56.890 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011641", "end": "2026-04-20 15:47:54.074002", "rc": 0, "start": "2026-04-20 15:47:54.062361" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 95fcf1a3-b37a-4b7a-9bf2-8d197d787770 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 928876 Threads: 2 Salt: 4d c7 16 ee 8f 2c 4c d1 1a 67 1e d8 9f b4 
9e 90 ba 66 26 ee 9f 30 fa 12 cb f8 84 28 49 86 b3 a9 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 4c 92 dc 7b 6f f8 03 4a f5 4c 41 f4 b9 76 e6 0f f6 22 ea e9 22 41 70 2c 0d a8 33 28 6e 15 d4 a0 Digest: 62 e3 00 db b1 b6 62 09 51 26 21 dc 6a 13 d3 02 19 ae d4 a6 ab 9d 21 0d eb 76 43 9f d1 a7 e7 9d TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:47:54 -0400 (0:00:01.358) 0:02:58.249 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:47:54 -0400 (0:00:00.382) 0:02:58.631 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:47:55 -0400 (0:00:00.356) 0:02:58.988 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:47:55 -0400 (0:00:00.275) 0:02:59.263 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:47:55 -0400 (0:00:00.299) 0:02:59.563 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:47:55 -0400 (0:00:00.252) 0:02:59.815 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:47:56 -0400 (0:00:00.259) 0:03:00.075 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:47:56 -0400 (0:00:00.252) 0:03:00.328 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:47:56 -0400 (0:00:00.496) 0:03:00.824 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:47:57 -0400 (0:00:00.338) 0:03:01.163 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:47:57 -0400 (0:00:00.243) 0:03:01.406 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:47:57 -0400 (0:00:00.300) 0:03:01.706 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:47:57 -0400 (0:00:00.200) 0:03:01.906 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:47:58 -0400 (0:00:00.247) 0:03:02.154 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:47:58 -0400 (0:00:00.256) 0:03:02.410 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:47:58 -0400 (0:00:00.350) 0:03:02.761 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:47:59 -0400 (0:00:00.282) 0:03:03.043 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:47:59 -0400 (0:00:00.242) 0:03:03.285 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:47:59 -0400 (0:00:00.273) 0:03:03.559 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:47:59 -0400 (0:00:00.300) 0:03:03.860 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:48:00 -0400 (0:00:00.203) 0:03:04.063 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:48:00 -0400 (0:00:00.186) 0:03:04.250 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:48:00 -0400 (0:00:00.265) 0:03:04.515 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:48:00 -0400 (0:00:00.223) 0:03:04.739 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:48:00 -0400 (0:00:00.203) 0:03:04.943 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:48:01 -0400 (0:00:00.199) 0:03:05.142 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:48:01 -0400 (0:00:00.202) 0:03:05.345 ********** ok: 
[managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:48:01 -0400 (0:00:00.204) 0:03:05.550 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:48:01 -0400 (0:00:00.219) 0:03:05.769 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 15:48:02 -0400 (0:00:00.211) 0:03:05.980 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:48:02 -0400 (0:00:00.203) 0:03:06.184 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:48:02 -0400 (0:00:00.280) 0:03:06.464 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:48:02 -0400 (0:00:00.209) 0:03:06.674 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:48:03 -0400 (0:00:00.322) 0:03:06.997 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:48:03 -0400 (0:00:00.306) 0:03:07.303 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:48:03 -0400 (0:00:00.293) 0:03:07.597 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:48:03 -0400 (0:00:00.299) 
0:03:07.897 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:48:04 -0400 (0:00:00.344) 0:03:08.241 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:48:04 -0400 (0:00:00.356) 0:03:08.597 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:48:05 -0400 (0:00:00.365) 0:03:08.963 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:48:05 -0400 (0:00:00.198) 0:03:09.161 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:48:05 -0400 (0:00:00.305) 0:03:09.467 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:48:05 -0400 (0:00:00.256) 0:03:09.723 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:48:06 -0400 (0:00:00.287) 0:03:10.011 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:48:06 -0400 (0:00:00.236) 0:03:10.247 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:48:06 -0400 (0:00:00.298) 0:03:10.546 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:48:06 -0400 
(0:00:00.224) 0:03:10.771 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:48:07 -0400 (0:00:00.217) 0:03:10.988 ********** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:48:07 -0400 (0:00:00.253) 0:03:11.242 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:48:07 -0400 (0:00:00.249) 0:03:11.491 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:48:07 -0400 (0:00:00.275) 0:03:11.767 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:48:08 -0400 (0:00:00.225) 0:03:11.992 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:48:08 -0400 (0:00:00.350) 0:03:12.342 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:48:08 -0400 (0:00:00.270) 0:03:12.613 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:48:08 -0400 (0:00:00.259) 0:03:12.872 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:48:09 -0400 (0:00:00.245) 0:03:13.117 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:48:09 -0400 (0:00:00.392) 0:03:13.510 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:48:09 -0400 (0:00:00.243) 0:03:13.753 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:48:09 -0400 (0:00:00.168) 0:03:13.921 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 15:48:10 -0400 (0:00:00.228) 0:03:14.150 ********** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:116 Monday 20 April 2026 15:48:13 -0400 (0:00:02.855) 0:03:17.006 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:48:13 -0400 (0:00:00.612) 0:03:17.618 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:48:13 -0400 (0:00:00.298) 0:03:17.916 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:48:14 -0400 (0:00:00.336) 0:03:18.253 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:48:14 -0400 (0:00:00.279) 0:03:18.533 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:48:14 -0400 (0:00:00.296) 0:03:18.829 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:48:15 -0400 (0:00:00.236) 0:03:19.066 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:48:17 -0400 (0:00:02.603) 0:03:21.670 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:48:19 -0400 (0:00:01.553) 0:03:23.224 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:48:19 -0400 (0:00:00.367) 0:03:23.592 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:48:19 -0400 
(0:00:00.277) 0:03:23.869 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:48:20 -0400 (0:00:00.178) 0:03:24.048 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:48:20 -0400 (0:00:00.196) 0:03:24.244 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:48:20 -0400 (0:00:00.377) 0:03:24.621 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:48:21 -0400 (0:00:01.283) 0:03:25.905 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:48:22 -0400 (0:00:00.255) 0:03:26.160 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:48:22 -0400 (0:00:00.206) 0:03:26.366 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:48:26 -0400 (0:00:04.491) 0:03:30.858 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:48:27 -0400 (0:00:00.189) 0:03:31.047 ********** ok: [managed-node3] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:48:27 -0400 (0:00:00.269) 0:03:31.317 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], 
"packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:48:32 -0400 (0:00:05.428) 0:03:36.745 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:48:32 -0400 (0:00:00.194) 0:03:36.939 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:48:33 -0400 (0:00:00.076) 0:03:37.016 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:48:33 -0400 (0:00:00.119) 0:03:37.136 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:48:33 -0400 (0:00:00.086) 0:03:37.222 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:48:37 -0400 (0:00:03.791) 0:03:41.014 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", 
"state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": 
"sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:48:39 -0400 (0:00:02.637) 0:03:43.651 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:48:40 -0400 (0:00:00.384) 0:03:44.036 ********** fatal: [managed-node3]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:48:45 -0400 (0:00:05.597) 0:03:49.634 ********** fatal: [managed-node3]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:48:45 -0400 (0:00:00.202) 0:03:49.837 ********** TASK [Check that we failed 
in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:48:46 -0400 (0:00:00.319) 0:03:50.156 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:48:46 -0400 (0:00:00.129) 0:03:50.286 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:48:46 -0400 (0:00:00.240) 0:03:50.526 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:48:46 -0400 (0:00:00.192) 0:03:50.719 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714492.7530684, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714492.7530684, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776714492.7530684, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1158448998", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:48:48 -0400 (0:00:01.505) 0:03:52.224 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:136 Monday 20 April 2026 15:48:48 -0400 (0:00:00.210) 0:03:52.435 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:48:48 -0400 (0:00:00.383) 0:03:52.819 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 
April 2026 15:48:48 -0400 (0:00:00.112) 0:03:52.932 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:48:49 -0400 (0:00:00.246) 0:03:53.178 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:48:49 -0400 (0:00:00.183) 0:03:53.362 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:48:51 -0400 (0:00:01.822) 0:03:55.185 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:48:52 -0400 (0:00:01.265) 0:03:56.450 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:48:53 -0400 (0:00:00.510) 0:03:56.961 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:48:53 -0400 (0:00:00.223) 0:03:57.184 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:48:53 -0400 (0:00:00.241) 0:03:57.426 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:48:53 -0400 (0:00:00.117) 0:03:57.543 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:48:53 -0400 (0:00:00.164) 0:03:57.708 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:48:54 -0400 (0:00:00.437) 0:03:58.145 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:48:54 -0400 (0:00:00.220) 0:03:58.366 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:48:54 -0400 (0:00:00.187) 0:03:58.553 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:48:58 -0400 (0:00:04.285) 0:04:02.838 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:48:59 -0400 (0:00:00.310) 0:04:03.148 ********** ok: [managed-node3] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:48:59 -0400 (0:00:00.217) 0:04:03.366 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:49:04 -0400 (0:00:05.358) 0:04:08.724 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:49:04 -0400 (0:00:00.187) 0:04:08.912 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:49:05 -0400 (0:00:00.060) 0:04:08.973 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:49:05 -0400 (0:00:00.088) 0:04:09.062 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:49:05 -0400 (0:00:00.080) 0:04:09.142 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:49:09 -0400 (0:00:04.282) 0:04:13.424 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:49:12 -0400 (0:00:02.671) 0:04:16.096 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:49:12 -0400 (0:00:00.233) 0:04:16.330 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, 
"crypts": [ { "backing_device": "/dev/sda", "name": "luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:49:17 -0400 (0:00:05.389) 0:04:21.719 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:49:17 -0400 (0:00:00.206) 0:04:21.926 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714438.7421753, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8e5b7ab5a5e33406dc57267a9f5fe780ce65c9ba", "ctime": 1776714438.7391756, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714438.7391756, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:49:19 -0400 (0:00:01.375) 
0:04:23.301 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:49:20 -0400 (0:00:01.489) 0:04:24.791 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:49:21 -0400 (0:00:00.271) 0:04:25.062 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:49:21 -0400 (0:00:00.214) 0:04:25.277 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:49:21 -0400 (0:00:00.274) 0:04:25.552 ********** ok: 
[managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:49:21 -0400 (0:00:00.250) 0:04:25.802 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:49:23 -0400 (0:00:01.336) 0:04:27.138 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:49:25 -0400 (0:00:01.922) 0:04:29.061 ********** changed: [managed-node3] => (item={'src': 'UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:49:26 -0400 (0:00:01.686) 
0:04:30.748 ********** skipping: [managed-node3] => (item={'src': 'UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:49:27 -0400 (0:00:00.407) 0:04:31.156 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:49:29 -0400 (0:00:02.206) 0:04:33.362 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714455.8921413, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3856c90b4d6973224bdd9ba06919808d10eee402", "ctime": 1776714443.9581652, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 10485959, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776714443.956165, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "3154363005", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:49:30 -0400 (0:00:01.478) 0:04:34.841 ********** changed: [managed-node3] => (item={'backing_device': '/dev/sda', 'name': 'luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:49:32 -0400 (0:00:01.890) 0:04:36.732 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:49:34 -0400 (0:00:02.046) 0:04:38.778 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 2] ************************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:148 Monday 20 April 2026 15:49:36 -0400 (0:00:01.562) 0:04:40.340 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:49:36 -0400 (0:00:00.509) 0:04:40.850 ********** skipping: [managed-node3] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:49:37 -0400 (0:00:00.246) 0:04:41.096 ********** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:49:37 -0400 (0:00:00.304) 0:04:41.401 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "e00121ad-995b-4c3e-9b3a-3774a767adfc" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:49:39 -0400 (0:00:01.685) 0:04:43.087 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002582", "end": "2026-04-20 15:49:40.522545", "rc": 0, "start": "2026-04-20 15:49:40.519963" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:49:40 -0400 (0:00:01.670) 0:04:44.757 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002451", "end": "2026-04-20 15:49:42.370223", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:49:42.367772" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:49:42 -0400 (0:00:01.880) 0:04:46.637 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:49:42 -0400 (0:00:00.190) 0:04:46.828 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:49:43 -0400 (0:00:00.462) 0:04:47.290 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:49:43 -0400 (0:00:00.223) 0:04:47.514 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:49:44 -0400 (0:00:01.155) 0:04:48.669 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:49:45 -0400 (0:00:00.324) 0:04:48.993 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:49:45 -0400 (0:00:00.428) 0:04:49.422 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:49:45 -0400 (0:00:00.358) 0:04:49.781 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:49:46 -0400 (0:00:00.205) 0:04:49.986 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:49:46 -0400 (0:00:00.256) 0:04:50.243 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:49:46 -0400 (0:00:00.313) 0:04:50.556 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:49:46 -0400 (0:00:00.312) 0:04:50.869 ********** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:49:47 -0400 (0:00:00.361) 0:04:51.230 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:49:47 -0400 (0:00:00.321) 0:04:51.552 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:49:47 -0400 (0:00:00.244) 0:04:51.797 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:49:48 -0400 (0:00:00.188) 0:04:51.986 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:49:48 -0400 (0:00:00.541) 0:04:52.527 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:49:48 -0400 (0:00:00.355) 0:04:52.883 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:49:49 -0400 (0:00:00.582) 0:04:53.465 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:49:49 -0400 (0:00:00.229) 0:04:53.694 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:49:50 -0400 (0:00:00.331) 0:04:54.025 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:49:50 -0400 (0:00:00.139) 0:04:54.165 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:49:50 -0400 (0:00:00.379) 0:04:54.544 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:49:50 -0400 (0:00:00.364) 0:04:54.908 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714557.5569398, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714557.5569398, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 38585, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776714557.5569398, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:49:52 -0400 (0:00:01.371) 0:04:56.280 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:49:52 -0400 (0:00:00.278) 0:04:56.559 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:49:52 -0400 (0:00:00.293) 0:04:56.852 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:49:53 -0400 (0:00:00.308) 0:04:57.161 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:49:53 -0400 (0:00:00.331) 0:04:57.492 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:49:53 -0400 (0:00:00.282) 0:04:57.775 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:49:54 -0400 (0:00:00.297) 0:04:58.073 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:49:54 -0400 (0:00:00.273) 0:04:58.346 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:49:58 -0400 (0:00:04.271) 0:05:02.617 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:49:58 -0400 (0:00:00.208) 0:05:02.826 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:49:59 -0400 (0:00:00.279) 0:05:03.105 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:49:59 -0400 (0:00:00.427) 0:05:03.533 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:49:59 -0400 (0:00:00.260) 0:05:03.793 ********** skipping: 
[managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:50:00 -0400 (0:00:00.416) 0:05:04.210 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:50:00 -0400 (0:00:00.299) 0:05:04.510 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:50:00 -0400 (0:00:00.272) 0:05:04.782 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:50:01 -0400 (0:00:00.298) 0:05:05.081 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:50:01 -0400 (0:00:00.355) 0:05:05.436 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:50:01 -0400 (0:00:00.317) 0:05:05.754 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:50:02 -0400 (0:00:00.329) 0:05:06.083 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:50:02 -0400 (0:00:00.312) 0:05:06.396 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:50:02 -0400 (0:00:00.283) 0:05:06.679 ********** ok: [managed-node3] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:50:02 -0400 (0:00:00.188) 0:05:06.868 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:50:03 -0400 (0:00:00.248) 0:05:07.117 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:50:03 -0400 (0:00:00.261) 0:05:07.378 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:50:03 -0400 (0:00:00.277) 0:05:07.655 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:50:04 -0400 (0:00:00.335) 0:05:07.991 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:50:04 -0400 (0:00:00.255) 0:05:08.247 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:50:04 -0400 (0:00:00.279) 0:05:08.527 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:50:04 -0400 (0:00:00.371) 0:05:08.898 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:50:05 -0400 (0:00:00.239) 0:05:09.137 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:50:05 -0400 (0:00:00.368) 0:05:09.506 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:50:05 -0400 (0:00:00.329) 0:05:09.835 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:50:06 -0400 (0:00:00.303) 0:05:10.139 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:50:06 -0400 (0:00:00.272) 0:05:10.411 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:50:06 -0400 (0:00:00.224) 0:05:10.636 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:50:06 -0400 (0:00:00.260) 0:05:10.897 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:50:07 -0400 (0:00:00.242) 0:05:11.140 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 15:50:07 -0400 (0:00:00.237) 0:05:11.377 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:50:07 -0400 (0:00:00.206) 0:05:11.583 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:50:07 -0400 (0:00:00.231) 0:05:11.815 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:50:08 -0400 (0:00:00.258) 0:05:12.074 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:50:08 -0400 (0:00:00.287) 0:05:12.361 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:50:08 -0400 (0:00:00.208) 0:05:12.570 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:50:08 -0400 (0:00:00.334) 0:05:12.904 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:50:09 -0400 (0:00:00.155) 0:05:13.059 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:50:09 -0400 (0:00:00.345) 0:05:13.405 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:50:09 -0400 (0:00:00.197) 0:05:13.602 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:50:09 -0400 (0:00:00.265) 0:05:13.867 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:50:10 -0400 (0:00:00.245) 0:05:14.112 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:50:10 -0400 (0:00:00.277) 0:05:14.390 ********** skipping: [managed-node3] => {} TASK 
[Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:50:10 -0400 (0:00:00.232) 0:05:14.622 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:50:11 -0400 (0:00:00.358) 0:05:14.981 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:50:11 -0400 (0:00:00.277) 0:05:15.258 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:50:11 -0400 (0:00:00.255) 0:05:15.514 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:50:11 -0400 (0:00:00.198) 0:05:15.712 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:50:12 -0400 (0:00:00.249) 0:05:15.962 ********** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:50:12 -0400 (0:00:00.314) 0:05:16.276 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:50:12 -0400 (0:00:00.355) 0:05:16.632 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:50:12 -0400 (0:00:00.179) 0:05:16.811 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:50:13 -0400 (0:00:00.323) 0:05:17.134 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:50:13 -0400 (0:00:00.198) 0:05:17.333 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:50:13 -0400 (0:00:00.255) 0:05:17.588 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:50:13 -0400 (0:00:00.232) 0:05:17.821 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:50:14 -0400 (0:00:00.349) 0:05:18.170 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:50:14 -0400 (0:00:00.347) 0:05:18.518 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:50:14 -0400 (0:00:00.419) 0:05:18.937 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:50:15 -0400 (0:00:00.236) 0:05:19.173 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 15:50:15 -0400 (0:00:00.206) 0:05:19.380 ********** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:154 Monday 20 April 2026 15:50:16 -0400 (0:00:01.104) 0:05:20.484 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:50:16 -0400 (0:00:00.412) 0:05:20.897 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:50:17 -0400 (0:00:00.271) 0:05:21.169 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:50:17 -0400 (0:00:00.307) 0:05:21.477 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:50:17 -0400 (0:00:00.124) 0:05:21.601 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:50:17 -0400 (0:00:00.332) 0:05:21.933 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:50:18 -0400 (0:00:00.243) 0:05:22.177 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:50:20 -0400 (0:00:02.704) 0:05:24.881 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:50:22 -0400 (0:00:01.735) 0:05:26.616 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:50:23 -0400 (0:00:00.568) 0:05:27.185 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:50:23 -0400 (0:00:00.334) 0:05:27.520 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:50:23 -0400 (0:00:00.338) 0:05:27.858 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:50:24 -0400 (0:00:00.216) 0:05:28.074 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:50:24 -0400 (0:00:00.217) 0:05:28.292 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:50:24 -0400 (0:00:00.550) 0:05:28.843 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:50:25 -0400 (0:00:00.371) 0:05:29.214 
********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:50:25 -0400 (0:00:00.192) 0:05:29.407 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:50:29 -0400 (0:00:04.405) 0:05:33.812 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:50:30 -0400 (0:00:00.306) 0:05:34.119 ********** ok: [managed-node3] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:50:30 -0400 (0:00:00.295) 0:05:34.415 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:50:36 -0400 (0:00:05.717) 0:05:40.132 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:50:36 -0400 (0:00:00.310) 0:05:40.443 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:50:36 -0400 (0:00:00.166) 0:05:40.610 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:50:36 -0400 (0:00:00.212) 0:05:40.822 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:50:37 -0400 (0:00:00.177) 0:05:41.000 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:50:41 -0400 (0:00:04.478) 0:05:45.478 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { 
"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": 
"nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service": { "name": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service": { "name": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:50:44 -0400 (0:00:02.753) 0:05:48.232 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d95fcf1a3\x2db37a\x2d4b7a\x2d9bf2\x2d8d197d787770.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "name": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket dev-sda.device system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", 
"NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:49:29 EDT", "StateChangeTimestampMonotonic": "6931114087", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...db37a\x2d4b7a\x2d9bf2\x2d8d197d787770.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "name": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot 
cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:50:46 -0400 (0:00:02.712) 0:05:50.944 ********** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:50:51 -0400 (0:00:04.818) 0:05:55.763 ********** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:50:52 -0400 (0:00:00.463) 0:05:56.226 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d95fcf1a3\x2db37a\x2d4b7a\x2d9bf2\x2d8d197d787770.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "name": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": 
"0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service is masked.\"", "LoadState": 
"masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d95fcf1a3\\x2db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...db37a\x2d4b7a\x2d9bf2\x2d8d197d787770.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "name": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": 
"no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", 
"NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db37a\\x2d4b7a\\x2d9bf2\\x2d8d197d787770.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:50:55 -0400 (0:00:03.475) 0:05:59.702 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:50:55 -0400 (0:00:00.219) 0:05:59.922 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:50:56 -0400 (0:00:00.131) 0:06:00.053 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:50:56 -0400 (0:00:00.166) 0:06:00.220 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714616.3218234, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", 
"checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714616.3218234, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776714616.3218234, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2606631326", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:50:57 -0400 (0:00:01.172) 0:06:01.393 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:174 Monday 20 April 2026 15:50:57 -0400 (0:00:00.079) 0:06:01.472 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:50:57 -0400 (0:00:00.202) 0:06:01.674 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:50:57 -0400 (0:00:00.247) 0:06:01.922 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:50:58 -0400 (0:00:00.249) 0:06:02.171 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:50:58 -0400 (0:00:00.269) 0:06:02.440 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:51:00 -0400 (0:00:01.793) 0:06:04.234 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:51:01 -0400 (0:00:01.159) 0:06:05.394 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional 
result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:51:01 -0400 (0:00:00.264) 0:06:05.658 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:51:01 -0400 (0:00:00.037) 0:06:05.696 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:51:01 -0400 (0:00:00.188) 0:06:05.885 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:51:02 -0400 (0:00:00.118) 0:06:06.003 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:51:02 -0400 (0:00:00.055) 0:06:06.058 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:51:02 -0400 (0:00:00.188) 0:06:06.247 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:51:02 -0400 (0:00:00.083) 0:06:06.330 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:51:02 -0400 (0:00:00.109) 0:06:06.440 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:51:06 -0400 (0:00:03.990) 0:06:10.430 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:51:06 -0400 (0:00:00.239) 0:06:10.670 ********** ok: [managed-node3] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:51:06 -0400 (0:00:00.196) 0:06:10.866 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:51:11 -0400 (0:00:04.934) 0:06:15.801 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:51:12 -0400 (0:00:00.288) 0:06:16.090 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:51:12 -0400 (0:00:00.147) 0:06:16.238 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:51:12 -0400 (0:00:00.222) 0:06:16.460 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 
April 2026 15:51:12 -0400 (0:00:00.183) 0:06:16.643 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:51:17 -0400 (0:00:04.455) 0:06:21.099 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", 
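
The services dictionary printed by this task comes from Ansible's service_facts module, which the role gathers right before the "Mask the systemd cryptsetup services" step so it knows which generated units exist. The same data is available to any play; a small illustrative sketch (sshd.service is simply one of the entries that appears in this dump):

    - name: Collect the systemd service inventory
      ansible.builtin.service_facts:

    - name: Show how one service looks in the gathered facts
      ansible.builtin.debug:
        msg: "sshd is {{ ansible_facts.services['sshd.service'].state }} ({{ ansible_facts.services['sshd.service'].status }})"
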
"state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": 
"ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:51:19 -0400 (0:00:02.662) 0:06:23.761 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:51:20 -0400 (0:00:00.305) 0:06:24.066 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:51:33 -0400 (0:00:13.683) 0:06:37.750 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:51:34 -0400 (0:00:00.272) 0:06:38.023 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714566.5069222, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4c526acee503b534adb08660d0fe377b5975b4de", "ctime": 1776714566.5049222, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714566.5049222, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:51:35 -0400 (0:00:01.689) 0:06:39.712 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:51:37 -0400 (0:00:01.832) 0:06:41.545 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:51:37 -0400 (0:00:00.355) 0:06:41.901 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3" ], 
"mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:51:38 -0400 (0:00:00.281) 0:06:42.182 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:51:38 -0400 (0:00:00.263) 0:06:42.446 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete 
mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:51:38 -0400 (0:00:00.233) 0:06:42.680 ********** changed: [managed-node3] => (item={'src': 'UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=e00121ad-995b-4c3e-9b3a-3774a767adfc" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:51:40 -0400 (0:00:01.496) 0:06:44.176 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:51:41 -0400 (0:00:01.669) 0:06:45.846 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:51:43 -0400 (0:00:01.624) 0:06:47.470 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:51:43 -0400 (0:00:00.340) 0:06:47.810 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:51:45 -0400 (0:00:02.017) 0:06:49.828 ********** ok: [managed-node3] 
=> { "changed": false, "stat": { "atime": 1776714582.3688908, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714572.4609103, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 150995139, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776714572.4599104, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "224210175", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:51:47 -0400 (0:00:01.580) 0:06:51.408 ********** changed: [managed-node3] => (item={'backing_device': '/dev/sda', 'name': 'luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:51:49 -0400 (0:00:01.739) 0:06:53.148 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:51:50 -0400 (0:00:01.710) 0:06:54.858 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:186 Monday 20 April 2026 15:51:52 -0400 (0:00:01.730) 0:06:56.589 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:51:53 -0400 (0:00:00.485) 0:06:57.074 ********** skipping: [managed-node3] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:51:53 -0400 (0:00:00.347) 0:06:57.421 ********** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:51:53 -0400 (0:00:00.258) 0:06:57.680 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "size": "10G", "type": "crypt", "uuid": "83ce77fb-bf71-4deb-af11-12d7ad2c14bd" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:51:55 -0400 (0:00:01.757) 0:06:59.437 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002752", "end": "2026-04-20 15:51:56.858491", "rc": 0, "start": "2026-04-20 15:51:56.855739" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. 
# # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:51:57 -0400 (0:00:01.652) 0:07:01.090 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002523", "end": "2026-04-20 15:51:58.421820", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:51:58.419297" } STDOUT: luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:51:58 -0400 (0:00:01.544) 0:07:02.635 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:51:58 -0400 (0:00:00.176) 0:07:02.811 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:51:59 -0400 (0:00:00.351) 0:07:03.162 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:51:59 -0400 (0:00:00.262) 0:07:03.425 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:52:00 -0400 (0:00:01.106) 0:07:04.531 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:52:00 -0400 (0:00:00.262) 0:07:04.794 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:52:01 -0400 (0:00:00.332) 0:07:05.126 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:52:01 -0400 (0:00:00.330) 0:07:05.456 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:52:01 -0400 (0:00:00.232) 0:07:05.689 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:52:02 -0400 (0:00:00.369) 0:07:06.059 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:52:02 -0400 (0:00:00.267) 0:07:06.326 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:52:02 -0400 (0:00:00.230) 0:07:06.557 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:52:02 -0400 (0:00:00.320) 0:07:06.878 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:52:03 -0400 (0:00:00.262) 0:07:07.140 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:52:03 -0400 (0:00:00.310) 0:07:07.450 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:52:03 -0400 (0:00:00.222) 0:07:07.673 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:52:04 -0400 (0:00:00.613) 0:07:08.286 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:52:04 -0400 (0:00:00.343) 0:07:08.630 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:52:04 -0400 (0:00:00.303) 0:07:08.933 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:52:05 -0400 (0:00:00.230) 0:07:09.164 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:52:05 -0400 (0:00:00.308) 0:07:09.472 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:52:05 -0400 (0:00:00.252) 0:07:09.724 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:52:06 -0400 (0:00:00.385) 0:07:10.110 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:52:06 -0400 (0:00:00.406) 0:07:10.517 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714693.319671, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714693.319671, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 38585, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776714693.319671, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:52:07 -0400 (0:00:01.406) 0:07:11.924 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:52:08 -0400 (0:00:00.241) 0:07:12.165 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:52:08 -0400 (0:00:00.283) 0:07:12.448 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:52:08 -0400 (0:00:00.330) 0:07:12.779 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:52:09 -0400 (0:00:00.642) 0:07:13.421 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:52:09 -0400 (0:00:00.302) 0:07:13.724 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:52:09 -0400 (0:00:00.149) 0:07:13.874 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714693.4586706, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714693.4586706, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 217642, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776714693.4586706, "nlink": 1, "path": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:52:11 -0400 (0:00:01.345) 0:07:15.219 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:52:15 -0400 (0:00:04.061) 0:07:19.281 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010116", "end": "2026-04-20 15:52:16.572722", "rc": 0, "start": "2026-04-20 15:52:16.562606" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) 
Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 937509 Threads: 2 Salt: 9e 86 b5 93 6c cf 88 81 7d 4c 5a 86 4b 0e c2 6d 3a f7 8d 40 d0 fc f0 d6 af 69 71 c1 a6 0c c2 6e AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 118940 Salt: 45 b4 62 91 46 d4 da 17 59 ff cf 7f 22 80 e3 24 e5 d0 5f c2 3d 3b 7d 4a be 46 90 0d 37 32 e8 dd Digest: 24 cf 62 fd a4 cc 1c 09 d9 11 81 2d 8c 65 2f 1d 29 ff 27 d5 ea 0c 60 e9 07 8c 74 41 e5 b7 81 78 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:52:16 -0400 (0:00:01.552) 0:07:20.834 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:52:17 -0400 (0:00:00.247) 0:07:21.081 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:52:17 -0400 (0:00:00.399) 0:07:21.481 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:52:17 -0400 (0:00:00.237) 0:07:21.719 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:52:18 -0400 (0:00:00.289) 0:07:22.008 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:52:18 -0400 (0:00:00.314) 0:07:22.322 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:52:18 -0400 (0:00:00.341) 0:07:22.664 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:52:18 -0400 (0:00:00.263) 
0:07:22.927 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:52:19 -0400 (0:00:00.326) 0:07:23.254 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:52:19 -0400 (0:00:00.354) 0:07:23.609 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:52:19 -0400 (0:00:00.314) 0:07:23.923 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:52:20 -0400 (0:00:00.362) 0:07:24.286 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:52:20 -0400 (0:00:00.217) 0:07:24.503 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:52:20 -0400 (0:00:00.088) 0:07:24.592 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:52:20 -0400 (0:00:00.160) 0:07:24.752 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:52:20 -0400 (0:00:00.162) 0:07:24.915 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:52:21 -0400 (0:00:00.276) 
0:07:25.191 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:52:21 -0400 (0:00:00.139) 0:07:25.331 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:52:21 -0400 (0:00:00.154) 0:07:25.485 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:52:21 -0400 (0:00:00.235) 0:07:25.721 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:52:21 -0400 (0:00:00.219) 0:07:25.940 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:52:22 -0400 (0:00:00.292) 0:07:26.232 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:52:22 -0400 (0:00:00.242) 0:07:26.475 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:52:22 -0400 (0:00:00.264) 0:07:26.740 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:52:22 -0400 (0:00:00.194) 0:07:26.935 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:52:23 -0400 (0:00:00.234) 0:07:27.170 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** 
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:52:23 -0400 (0:00:00.172) 0:07:27.342 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:52:23 -0400 (0:00:00.211) 0:07:27.554 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:52:23 -0400 (0:00:00.288) 0:07:27.843 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 15:52:24 -0400 (0:00:00.218) 0:07:28.062 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:52:24 -0400 (0:00:00.292) 0:07:28.354 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:52:24 -0400 (0:00:00.261) 0:07:28.616 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:52:24 -0400 (0:00:00.285) 0:07:28.901 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:52:25 -0400 (0:00:00.235) 0:07:29.136 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:52:25 -0400 (0:00:00.263) 0:07:29.400 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:52:25 -0400 (0:00:00.198) 0:07:29.599 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:52:25 -0400 (0:00:00.294) 0:07:29.894 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:52:26 -0400 (0:00:00.275) 0:07:30.170 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:52:26 -0400 (0:00:00.309) 0:07:30.479 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:52:26 -0400 (0:00:00.296) 0:07:30.776 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:52:27 -0400 (0:00:00.251) 0:07:31.028 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:52:27 -0400 (0:00:00.256) 0:07:31.284 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:52:27 -0400 (0:00:00.282) 0:07:31.567 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:52:27 -0400 (0:00:00.323) 0:07:31.890 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:52:28 -0400 (0:00:00.257) 0:07:32.148 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:52:28 -0400 (0:00:00.255) 0:07:32.403 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:52:28 -0400 (0:00:00.222) 0:07:32.626 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:52:28 -0400 (0:00:00.238) 0:07:32.864 ********** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:52:28 -0400 (0:00:00.083) 0:07:32.948 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:52:29 -0400 (0:00:00.157) 0:07:33.105 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:52:29 -0400 (0:00:00.258) 0:07:33.363 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:52:29 -0400 (0:00:00.248) 0:07:33.612 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:52:29 -0400 (0:00:00.285) 0:07:33.897 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:52:30 -0400 (0:00:00.214) 0:07:34.111 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:52:30 -0400 (0:00:00.239) 0:07:34.351 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:52:30 -0400 (0:00:00.231) 0:07:34.582 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:52:30 -0400 (0:00:00.222) 0:07:34.805 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:52:31 -0400 (0:00:00.199) 0:07:35.004 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:52:31 -0400 (0:00:00.147) 0:07:35.152 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:193 Monday 20 April 2026 15:52:31 -0400 (0:00:00.266) 0:07:35.419 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:52:32 -0400 (0:00:00.626) 0:07:36.045 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:52:32 -0400 (0:00:00.253) 0:07:36.299 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:52:32 -0400 (0:00:00.322) 0:07:36.621 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:52:32 -0400 (0:00:00.324) 0:07:36.946 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:52:33 -0400 (0:00:00.245) 0:07:37.192 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:52:33 -0400 (0:00:00.225) 0:07:37.418 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:52:36 -0400 (0:00:02.754) 0:07:40.172 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:52:37 -0400 (0:00:01.492) 0:07:41.665 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:52:38 -0400 (0:00:00.357) 0:07:42.022 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:52:38 -0400 (0:00:00.285) 0:07:42.308 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:52:38 -0400 (0:00:00.194) 0:07:42.502 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:52:38 -0400 (0:00:00.217) 0:07:42.720 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:52:38 -0400 (0:00:00.218) 0:07:42.938 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:52:39 -0400 (0:00:00.419) 0:07:43.358 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:52:39 -0400 (0:00:00.201) 0:07:43.560 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:52:39 -0400 (0:00:00.217) 0:07:43.777 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:52:44 -0400 (0:00:04.235) 0:07:48.012 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:52:44 -0400 (0:00:00.220) 0:07:48.233 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:52:44 -0400 (0:00:00.115) 0:07:48.349 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:52:49 -0400 (0:00:05.074) 0:07:53.423 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:52:49 -0400 (0:00:00.226) 0:07:53.650 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:52:49 -0400 (0:00:00.178) 0:07:53.829 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:52:50 -0400 (0:00:00.175) 0:07:54.004 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:52:50 -0400 (0:00:00.090) 0:07:54.095 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:52:54 -0400 (0:00:03.935) 0:07:58.031 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:52:56 -0400 (0:00:02.514) 0:08:00.546 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:52:56 -0400 (0:00:00.231) 0:08:00.777 ********** fatal: [managed-node3]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:53:02 -0400 (0:00:05.519) 0:08:06.297 ********** fatal: [managed-node3]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 
'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:53:02 -0400 (0:00:00.244) 0:08:06.541 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:53:02 -0400 (0:00:00.370) 0:08:06.911 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:53:03 -0400 (0:00:00.198) 0:08:07.110 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 15:53:03 -0400 (0:00:00.326) 0:08:07.436 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:212 Monday 20 April 2026 15:53:03 -0400 (0:00:00.313) 0:08:07.749 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:53:04 -0400 (0:00:00.653) 0:08:08.403 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:53:04 -0400 (0:00:00.184) 0:08:08.588 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:53:04 -0400 (0:00:00.161) 0:08:08.749 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:53:04 -0400 (0:00:00.179) 0:08:08.929 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:53:07 -0400 (0:00:02.117) 0:08:11.046 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version 
specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:53:08 -0400 (0:00:01.457) 0:08:12.503 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:53:08 -0400 (0:00:00.381) 0:08:12.884 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:53:09 -0400 (0:00:00.326) 0:08:13.211 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:53:09 -0400 (0:00:00.100) 0:08:13.311 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:53:09 -0400 (0:00:00.117) 0:08:13.428 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:53:09 -0400 (0:00:00.108) 0:08:13.537 ********** included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:53:10 -0400 (0:00:00.414) 0:08:13.951 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:53:10 -0400 (0:00:00.157) 0:08:14.108 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:53:10 -0400 (0:00:00.329) 0:08:14.437 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:53:14 -0400 (0:00:04.261) 0:08:18.699 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:53:14 -0400 (0:00:00.241) 0:08:18.941 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:53:15 -0400 (0:00:00.341) 0:08:19.282 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:53:20 -0400 (0:00:05.372) 0:08:24.654 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:53:21 -0400 (0:00:00.301) 0:08:24.956 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:53:21 -0400 (0:00:00.150) 0:08:25.107 ********** skipping: [managed-node3] => { "changed": false, 
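
The storage_pools structure echoed by the role above is the input for this scenario. Unlike the earlier invocation, which failed with "encrypted volume 'test1' missing key/password", this one supplies an encryption_password for the volume, so blivet can set up the LUKS layer. Expressed as plain playbook variables, the same input would look roughly like the sketch below (values taken from the output above; the password is the throwaway test value used by this run):

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            # Omitting both encryption_password and encryption_key reproduces the
            # "missing key/password" failure seen earlier in this run.
            encryption_password: yabbadabbadoo
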
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:53:21 -0400 (0:00:00.143) 0:08:25.250 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:53:21 -0400 (0:00:00.055) 0:08:25.306 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:53:25 -0400 (0:00:04.175) 0:08:29.482 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": 
"grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": 
"syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:53:28 -0400 (0:00:02.645) 0:08:32.128 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:53:28 -0400 (0:00:00.384) 0:08:32.512 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-36889b12-f60f-49c7-ac51-807f73989f4a", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": 
null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:53:42 -0400 (0:00:13.972) 0:08:46.485 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:53:42 -0400 (0:00:00.276) 0:08:46.761 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714703.322651, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f4c93a47c3ea2add8c004998a865599f0ee4f4dc", "ctime": 1776714703.3196511, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714703.3196511, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if 
present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:53:44 -0400 (0:00:01.632) 0:08:48.394 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:53:46 -0400 (0:00:01.668) 0:08:50.063 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:53:46 -0400 (0:00:00.389) 0:08:50.452 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-36889b12-f60f-49c7-ac51-807f73989f4a", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:53:46 -0400 (0:00:00.326) 0:08:50.779 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:53:47 -0400 (0:00:00.197) 0:08:50.976 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:53:47 -0400 (0:00:00.164) 0:08:51.140 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3', 'path': 
'/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:53:48 -0400 (0:00:01.436) 0:08:52.576 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:53:50 -0400 (0:00:01.826) 0:08:54.403 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:53:51 -0400 (0:00:01.383) 0:08:55.787 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:53:52 -0400 (0:00:00.406) 0:08:56.194 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:53:53 -0400 (0:00:01.584) 0:08:57.778 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714718.4206212, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "16a6305a7138c4c565e8e4c017dec8d7a8b14cc8", "ctime": 1776714708.9596398, "dev": 51713, "device_type": 0, "executable": 
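
The two mount tasks above first drop the /etc/fstab entry that pointed at the destroyed LUKS mapping and then mount the new mapper device and persist it. A rough stand-alone equivalent of that bookkeeping, written with the stock mount module and the same values shown in the log (an illustrative sketch, not the role's actual implementation):

    - name: Drop the fstab entry for the old, destroyed LUKS device
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3
        fstype: xfs
        state: absent

    - name: Mount the new LUKS-backed file system and record it in /etc/fstab
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a
        fstype: xfs
        opts: defaults
        state: mounted
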
false, "exists": true, "gid": 0, "gr_name": "root", "inode": 299893005, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776714708.95764, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "3312237637", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:53:55 -0400 (0:00:01.578) 0:08:59.357 ********** changed: [managed-node3] => (item={'backing_device': '/dev/sda', 'name': 'luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node3] => (item={'backing_device': '/dev/sda1', 'name': 'luks-36889b12-f60f-49c7-ac51-807f73989f4a', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-36889b12-f60f-49c7-ac51-807f73989f4a", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:53:58 -0400 (0:00:03.298) 0:09:02.655 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:54:00 -0400 (0:00:01.827) 0:09:04.483 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:228 Monday 20 April 2026 15:54:01 -0400 (0:00:01.451) 0:09:05.934 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:54:03 -0400 (0:00:01.237) 0:09:07.172 ********** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_kernel_device": 
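
The /etc/crypttab maintenance above removes the entry for the old mapping (luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3) and adds one for the new device; the resulting line, read back later in this log, is "luks-36889b12-f60f-49c7-ac51-807f73989f4a /dev/sda1 -", i.e. mapper name, backing device, and "-" in the password field (no key file recorded). A quick manual cross-check of that mapping could be expressed as ad-hoc tasks like the following sketch (a hypothetical verification step, not part of this test):

    - name: Read the LUKS UUID of the backing partition (hypothetical check)
      command: cryptsetup luksUUID /dev/sda1
      register: luks_uuid
      changed_when: false

    - name: The mapper name written to /etc/crypttab embeds the same UUID
      assert:
        that:
          - luks_uuid.stdout == '36889b12-f60f-49c7-ac51-807f73989f4a'
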
"/dev/dm-0", "_mount_id": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:54:03 -0400 (0:00:00.327) 0:09:07.500 ********** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:54:03 -0400 (0:00:00.233) 0:09:07.734 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "size": "4G", "type": "crypt", "uuid": "6b0dd9a2-3248-46dc-b21f-c848b1c02746" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "36889b12-f60f-49c7-ac51-807f73989f4a" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": 
"fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:54:05 -0400 (0:00:01.528) 0:09:09.263 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002646", "end": "2026-04-20 15:54:06.550585", "rc": 0, "start": "2026-04-20 15:54:06.547939" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:54:06 -0400 (0:00:01.483) 0:09:10.747 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002653", "end": "2026-04-20 15:54:08.082049", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:54:08.079396" } STDOUT: luks-36889b12-f60f-49c7-ac51-807f73989f4a /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:54:08 -0400 (0:00:01.531) 0:09:12.278 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 15:54:08 -0400 (0:00:00.362) 0:09:12.640 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 15:54:08 -0400 (0:00:00.120) 0:09:12.761 ********** skipping: [managed-node3] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 15:54:09 -0400 (0:00:00.201) 0:09:12.962 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 15:54:09 -0400 (0:00:00.155) 0:09:13.118 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 15:54:09 -0400 (0:00:00.267) 0:09:13.386 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 15:54:09 -0400 (0:00:00.196) 0:09:13.583 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 15:54:09 -0400 (0:00:00.167) 0:09:13.750 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 15:54:09 -0400 (0:00:00.116) 0:09:13.867 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 15:54:10 -0400 (0:00:00.317) 0:09:14.185 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 15:54:10 -0400 (0:00:00.205) 0:09:14.390 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 15:54:10 -0400 (0:00:00.238) 0:09:14.629 ********** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 15:54:10 -0400 (0:00:00.228) 0:09:14.857 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 15:54:11 -0400 (0:00:00.265) 0:09:15.123 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 15:54:11 -0400 (0:00:00.292) 0:09:15.416 ********** ok: [managed-node3] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.113 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 15:54:13 -0400 (0:00:01.685) 0:09:17.102 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 15:54:13 -0400 (0:00:00.240) 0:09:17.343 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 15:54:13 -0400 (0:00:00.497) 0:09:17.840 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 15:54:14 -0400 (0:00:00.279) 0:09:18.119 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 15:54:14 -0400 (0:00:00.377) 0:09:18.497 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 15:54:14 -0400 (0:00:00.281) 0:09:18.778 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 15:54:15 -0400 (0:00:00.314) 0:09:19.093 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 15:54:15 -0400 (0:00:00.293) 0:09:19.386 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 15:54:15 -0400 (0:00:00.379) 0:09:19.766 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 15:54:16 -0400 (0:00:00.354) 0:09:20.120 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 15:54:16 -0400 (0:00:00.270) 0:09:20.390 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 15:54:16 -0400 (0:00:00.319) 0:09:20.709 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 15:54:16 -0400 (0:00:00.147) 0:09:20.857 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 15:54:17 -0400 (0:00:00.231) 0:09:21.089 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 15:54:17 -0400 (0:00:00.506) 0:09:21.596 ********** skipping: [managed-node3] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 
'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 15:54:17 -0400 (0:00:00.278) 0:09:21.874 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 15:54:18 -0400 (0:00:00.356) 0:09:22.230 ********** skipping: [managed-node3] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 
'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 15:54:18 -0400 (0:00:00.369) 0:09:22.600 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 15:54:19 -0400 (0:00:00.631) 0:09:23.231 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 15:54:19 -0400 (0:00:00.362) 0:09:23.594 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 15:54:19 -0400 (0:00:00.258) 0:09:23.852 
********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 15:54:20 -0400 (0:00:00.406) 0:09:24.258 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 15:54:20 -0400 (0:00:00.220) 0:09:24.478 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 15:54:21 -0400 (0:00:00.499) 0:09:24.978 ********** skipping: [managed-node3] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 15:54:21 -0400 (0:00:00.286) 0:09:25.264 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 15:54:22 -0400 (0:00:00.735) 0:09:25.999 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 15:54:22 -0400 (0:00:00.310) 0:09:26.309 ********** skipping: [managed-node3] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 15:54:22 -0400 (0:00:00.333) 0:09:26.643 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 15:54:23 -0400 (0:00:00.406) 0:09:27.050 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 15:54:23 -0400 (0:00:00.228) 0:09:27.278 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 15:54:23 -0400 (0:00:00.194) 0:09:27.472 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 15:54:23 -0400 (0:00:00.376) 0:09:27.848 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 15:54:24 -0400 (0:00:00.262) 0:09:28.111 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, 
"_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 15:54:24 -0400 (0:00:00.331) 0:09:28.443 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:54:25 -0400 (0:00:00.518) 0:09:28.961 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:54:25 -0400 (0:00:00.305) 0:09:29.267 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:54:27 -0400 (0:00:02.347) 0:09:31.615 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:54:28 -0400 (0:00:00.409) 0:09:32.025 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:54:28 -0400 (0:00:00.333) 
0:09:32.358 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:54:28 -0400 (0:00:00.429) 0:09:32.787 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:54:29 -0400 (0:00:00.283) 0:09:33.071 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:54:29 -0400 (0:00:00.260) 0:09:33.332 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:54:29 -0400 (0:00:00.270) 0:09:33.602 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:54:29 -0400 (0:00:00.203) 0:09:33.806 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:54:30 -0400 (0:00:00.227) 0:09:34.034 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:54:30 -0400 (0:00:00.242) 0:09:34.277 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:54:30 -0400 (0:00:00.141) 0:09:34.418 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:54:30 -0400 (0:00:00.113) 0:09:34.531 ********** ok: [managed-node3] => { 
"ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:54:31 -0400 (0:00:00.451) 0:09:34.983 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:54:31 -0400 (0:00:00.196) 0:09:35.180 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:54:31 -0400 (0:00:00.271) 0:09:35.451 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:54:31 -0400 (0:00:00.237) 0:09:35.689 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:54:32 -0400 (0:00:00.300) 0:09:35.989 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:54:32 -0400 (0:00:00.140) 0:09:36.129 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:54:32 -0400 (0:00:00.310) 0:09:36.440 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:54:32 -0400 (0:00:00.354) 0:09:36.794 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714822.0754156, 
"attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714822.0754156, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 231997, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776714822.0754156, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:54:34 -0400 (0:00:01.718) 0:09:38.513 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:54:34 -0400 (0:00:00.365) 0:09:38.879 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:54:35 -0400 (0:00:00.333) 0:09:39.213 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:54:35 -0400 (0:00:00.339) 0:09:39.552 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:54:35 -0400 (0:00:00.334) 0:09:39.886 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:54:36 -0400 (0:00:00.418) 0:09:40.305 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:54:36 -0400 (0:00:00.378) 0:09:40.684 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714822.2204154, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714822.2204154, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 232221, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, 
"isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776714822.2204154, "nlink": 1, "path": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:54:38 -0400 (0:00:01.636) 0:09:42.320 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:54:42 -0400 (0:00:04.368) 0:09:46.689 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.009805", "end": "2026-04-20 15:54:43.813762", "rc": 0, "start": "2026-04-20 15:54:43.803957" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 36889b12-f60f-49c7-ac51-807f73989f4a Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 930270 Threads: 2 Salt: b8 ef 43 00 00 44 9a 1c 9e ef 73 c0 5c e9 99 62 9a ea 84 a4 d2 9b 6f 9a c0 ce f8 39 2e 80 dc 0c AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: 43 51 0f d9 2d 64 38 9f 11 10 76 ec f2 95 3c 15 a1 2e 1e c9 01 12 48 8a bc 5d b2 00 b4 84 ac 53 Digest: f3 85 a8 a6 c1 07 cc 66 e1 ce 73 a9 e1 32 db b7 38 2e 12 19 a3 cb 86 a4 03 56 5e 7e 7c 03 3d 2c TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:54:44 -0400 (0:00:01.413) 0:09:48.103 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:54:44 -0400 (0:00:00.407) 0:09:48.511 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:54:44 -0400 (0:00:00.230) 0:09:48.741 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 
April 2026 15:54:45 -0400 (0:00:00.304) 0:09:49.045 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:54:45 -0400 (0:00:00.328) 0:09:49.373 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:54:45 -0400 (0:00:00.342) 0:09:49.716 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:54:46 -0400 (0:00:00.290) 0:09:50.007 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:54:46 -0400 (0:00:00.295) 0:09:50.303 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-36889b12-f60f-49c7-ac51-807f73989f4a /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:54:46 -0400 (0:00:00.379) 0:09:50.683 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:54:47 -0400 (0:00:00.336) 0:09:51.020 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:54:47 -0400 (0:00:00.393) 0:09:51.413 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:54:47 -0400 (0:00:00.366) 0:09:51.779 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:54:48 -0400 (0:00:00.344) 0:09:52.124 ********** ok: [managed-node3] => { 
"ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:54:48 -0400 (0:00:00.125) 0:09:52.249 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:54:48 -0400 (0:00:00.246) 0:09:52.496 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:54:48 -0400 (0:00:00.407) 0:09:52.903 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:54:49 -0400 (0:00:00.269) 0:09:53.172 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:54:49 -0400 (0:00:00.289) 0:09:53.462 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:54:49 -0400 (0:00:00.263) 0:09:53.726 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:54:50 -0400 (0:00:00.312) 0:09:54.038 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:54:50 -0400 (0:00:00.233) 0:09:54.272 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:54:50 -0400 (0:00:00.279) 0:09:54.552 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:54:50 -0400 (0:00:00.298) 0:09:54.850 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:54:51 -0400 (0:00:00.378) 0:09:55.228 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:54:51 -0400 (0:00:00.336) 0:09:55.564 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:54:51 -0400 (0:00:00.337) 0:09:55.902 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:54:52 -0400 (0:00:00.293) 0:09:56.195 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:54:52 -0400 (0:00:00.309) 0:09:56.505 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:54:52 -0400 (0:00:00.275) 0:09:56.781 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 15:54:53 -0400 (0:00:00.254) 0:09:57.035 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:54:53 -0400 (0:00:00.258) 0:09:57.293 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:54:53 -0400 (0:00:00.326) 0:09:57.620 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:54:53 -0400 (0:00:00.275) 0:09:57.895 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:54:54 -0400 (0:00:00.242) 0:09:58.138 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:54:54 -0400 (0:00:00.323) 0:09:58.462 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:54:54 -0400 (0:00:00.308) 0:09:58.770 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:54:55 -0400 (0:00:00.332) 0:09:59.103 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:54:55 -0400 (0:00:00.254) 0:09:59.357 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:54:55 -0400 (0:00:00.339) 0:09:59.696 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:54:56 -0400 (0:00:00.304) 0:10:00.001 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:54:56 -0400 (0:00:00.332) 0:10:00.334 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:54:56 -0400 (0:00:00.240) 0:10:00.574 ********** skipping: [managed-node3] => {} TASK 
[Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:54:56 -0400 (0:00:00.161) 0:10:00.736 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:54:57 -0400 (0:00:00.260) 0:10:00.996 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:54:57 -0400 (0:00:00.273) 0:10:01.270 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:54:57 -0400 (0:00:00.303) 0:10:01.574 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:54:57 -0400 (0:00:00.259) 0:10:01.833 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:54:58 -0400 (0:00:00.427) 0:10:02.260 ********** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:54:58 -0400 (0:00:00.313) 0:10:02.574 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:54:58 -0400 (0:00:00.244) 0:10:02.819 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:54:59 -0400 (0:00:00.320) 0:10:03.140 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:54:59 -0400 (0:00:00.294) 0:10:03.435 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:54:59 -0400 (0:00:00.318) 0:10:03.753 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:55:00 -0400 (0:00:00.280) 0:10:04.033 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:55:00 -0400 (0:00:00.283) 0:10:04.316 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:55:00 -0400 (0:00:00.295) 0:10:04.612 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:55:00 -0400 (0:00:00.276) 0:10:04.888 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:55:01 -0400 (0:00:00.244) 0:10:05.133 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:55:01 -0400 (0:00:00.296) 0:10:05.430 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:55:01 -0400 (0:00:00.265) 0:10:05.695 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 15:55:01 -0400 (0:00:00.166) 0:10:05.862 ********** changed: [managed-node3] => { "changed": true, 
"dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:234 Monday 20 April 2026 15:55:03 -0400 (0:00:01.317) 0:10:07.180 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:55:03 -0400 (0:00:00.630) 0:10:07.810 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:55:04 -0400 (0:00:00.245) 0:10:08.073 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:55:04 -0400 (0:00:00.670) 0:10:08.743 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:55:04 -0400 (0:00:00.125) 0:10:08.869 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:55:05 -0400 (0:00:00.248) 0:10:09.118 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:55:05 -0400 (0:00:00.167) 0:10:09.286 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:55:07 -0400 (0:00:02.194) 0:10:11.481 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:55:09 -0400 (0:00:01.762) 0:10:13.243 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: 
[managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:55:09 -0400 (0:00:00.490) 0:10:13.734 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:55:10 -0400 (0:00:00.348) 0:10:14.082 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:55:10 -0400 (0:00:00.229) 0:10:14.312 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:55:10 -0400 (0:00:00.188) 0:10:14.500 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:55:10 -0400 (0:00:00.206) 0:10:14.706 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:55:11 -0400 (0:00:00.629) 0:10:15.336 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } 
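
A few tasks back, the "Set platform/version specific variables" lookup skipped RedHat.yml and CentOS.yml and included CentOS_8.yml twice. That is the result the usual first-match vars loop would produce; the sketch below is a minimal reconstruction under that assumption (the task wording, file layout, and loop/when construction are illustrative guesses, not the role's actual source -- only the file names and the skip/include outcomes come from the log):

    # Hypothetical sketch: load the most specific platform vars file that exists.
    - name: Set platform/version specific variables
      include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"        # RedHat.yml  -> skipped (no such file)
        - "{{ ansible_facts['distribution'] }}.yml"     # CentOS.yml  -> skipped (no such file)
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      when: (role_path ~ '/vars/' ~ item) is file

When the distribution version string carries no minor part (as on CentOS Stream 8, where both facts are "8"), the last two items resolve to the same CentOS_8.yml, which would explain the duplicate include seen above. The included file is also where blivet_package_list comes from, including the inline conditional that swaps libblockdev for libblockdev-s390 on s390x hosts.
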
TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:55:11 -0400 (0:00:00.180) 0:10:15.516 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:55:11 -0400 (0:00:00.103) 0:10:15.620 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:55:16 -0400 (0:00:04.603) 0:10:20.224 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:55:16 -0400 (0:00:00.318) 0:10:20.543 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:55:16 -0400 (0:00:00.254) 0:10:20.798 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:55:22 -0400 (0:00:05.568) 0:10:26.366 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:55:22 -0400 (0:00:00.273) 0:10:26.640 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:55:22 -0400 (0:00:00.091) 0:10:26.732 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:55:22 -0400 (0:00:00.186) 0:10:26.919 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:55:23 -0400 (0:00:00.182) 0:10:27.101 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:55:27 -0400 (0:00:04.467) 0:10:31.568 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service": { "name": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service": { "name": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": 
"systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:55:30 -0400 (0:00:02.731) 0:10:34.300 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2da2c66ca8\x2da44f\x2d4b1a\x2d8c4e\x2d7eafef8112f3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "name": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-sda.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct 
cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-a2c66ca8-a44f-4b1a-8c4e-7eafef8112f3 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", 
"MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:53:53 EDT", "StateChangeTimestampMonotonic": "7195585202", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...da44f\x2d4b1a\x2d8c4e\x2d7eafef8112f3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "name": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": 
"0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", 
"NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:55:34 -0400 (0:00:04.014) 0:10:38.315 ********** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-36889b12-f60f-49c7-ac51-807f73989f4a' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:55:39 -0400 (0:00:05.608) 0:10:43.923 ********** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-36889b12-f60f-49c7-ac51-807f73989f4a' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:55:40 -0400 (0:00:00.331) 0:10:44.255 ********** changed: [managed-node3] => 
(item=systemd-cryptsetup@luks\x2da2c66ca8\x2da44f\x2d4b1a\x2d8c4e\x2d7eafef8112f3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "name": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2da2c66ca8\\x2da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...da44f\x2d4b1a\x2d8c4e\x2d7eafef8112f3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "name": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...da44f\\x2d4b1a\\x2d8c4e\\x2d7eafef8112f3.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:55:43 -0400 (0:00:03.573) 0:10:47.829 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:55:44 -0400 (0:00:00.214) 0:10:48.044 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Monday 20 April 2026 15:55:44 -0400 (0:00:00.304) 0:10:48.348 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:55:44 -0400 (0:00:00.113) 0:10:48.461 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714902.9652553, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714902.9652553, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776714902.9652553, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "4105449662", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:55:45 -0400 (0:00:01.159) 0:10:49.621 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:258 Monday 20 April 2026 15:55:45 -0400 (0:00:00.166) 0:10:49.788 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:55:46 -0400 (0:00:00.331) 0:10:50.120 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:55:46 -0400 (0:00:00.038) 0:10:50.158 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:55:46 -0400 (0:00:00.039) 0:10:50.197 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:55:46 -0400 (0:00:00.123) 0:10:50.320 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:55:48 -0400 (0:00:01.755) 0:10:52.076 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:55:49 -0400 (0:00:01.324) 0:10:53.400 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:55:49 -0400 (0:00:00.280) 0:10:53.681 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:55:49 -0400 (0:00:00.230) 0:10:53.912 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:55:50 -0400 (0:00:00.196) 0:10:54.108 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:55:50 -0400 (0:00:00.115) 0:10:54.223 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:55:50 -0400 (0:00:00.150) 0:10:54.374 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:55:50 -0400 (0:00:00.421) 0:10:54.796 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:55:51 -0400 (0:00:00.212) 0:10:55.009 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:55:51 -0400 (0:00:00.180) 0:10:55.189 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:55:54 -0400 (0:00:03.533) 0:10:58.722 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:55:54 -0400 (0:00:00.217) 0:10:58.939 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:55:55 -0400 (0:00:00.200) 0:10:59.140 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:56:00 -0400 (0:00:05.101) 0:11:04.242 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:56:00 -0400 (0:00:00.293) 0:11:04.535 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:56:00 -0400 (0:00:00.110) 0:11:04.646 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:56:00 -0400 (0:00:00.139) 0:11:04.786 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:56:00 -0400 (0:00:00.128) 0:11:04.914 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:56:05 -0400 (0:00:04.522) 0:11:09.437 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": 
"systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": 
"inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": 
"stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service": { "name": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service": { "name": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:56:08 -0400 (0:00:02.710) 0:11:12.148 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d36889b12\x2df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-36889b12-f60f-49c7-ac51-807f73989f4a", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-36889b12-f60f-49c7-ac51-807f73989f4a /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-36889b12-f60f-49c7-ac51-807f73989f4a ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": 
"262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:55:34 EDT", "StateChangeTimestampMonotonic": "7296057805", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": 
"systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:56:11 -0400 (0:00:03.127) 0:11:15.275 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", 
"name": "luks-36889b12-f60f-49c7-ac51-807f73989f4a", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:56:17 -0400 (0:00:06.283) 0:11:21.558 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:56:17 -0400 (0:00:00.094) 0:11:21.652 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714831.6113968, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a82d429e447306d2601f0bb357cac552b9ba5a8c", "ctime": 1776714831.6083968, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714831.6083968, "nlink": 1, 
"path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:56:18 -0400 (0:00:01.253) 0:11:22.906 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:56:20 -0400 (0:00:01.558) 0:11:24.465 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d36889b12\x2df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", 
"IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:55:34 EDT", 
"StateChangeTimestampMonotonic": "7296057805", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", 
"InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 
30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:56:24 -0400 (0:00:03.688) 0:11:28.153 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-36889b12-f60f-49c7-ac51-807f73989f4a", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:56:24 
-0400 (0:00:00.278) 0:11:28.431 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:56:24 -0400 (0:00:00.263) 0:11:28.695 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:56:25 -0400 (0:00:00.335) 0:11:29.030 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-36889b12-f60f-49c7-ac51-807f73989f4a" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:56:26 -0400 (0:00:01.830) 0:11:30.861 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 
2026 15:56:28 -0400 (0:00:01.985) 0:11:32.847 ********** changed: [managed-node3] => (item={'src': 'UUID=895d0975-61ca-4e1b-8227-872de03b853f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:56:30 -0400 (0:00:01.548) 0:11:34.395 ********** skipping: [managed-node3] => (item={'src': 'UUID=895d0975-61ca-4e1b-8227-872de03b853f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:56:30 -0400 (0:00:00.332) 0:11:34.728 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:56:32 -0400 (0:00:01.885) 0:11:36.614 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714848.0803642, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "faee1430581158b769137555f3b56e1458e67f3b", "ctime": 1776714838.492383, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 436207766, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776714838.492383, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "1419654061", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:56:34 -0400 (0:00:01.691) 0:11:38.306 ********** changed: [managed-node3] => (item={'backing_device': '/dev/sda1', 'name': 'luks-36889b12-f60f-49c7-ac51-807f73989f4a', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", 
"backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-36889b12-f60f-49c7-ac51-807f73989f4a", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:56:35 -0400 (0:00:01.592) 0:11:39.899 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:56:37 -0400 (0:00:01.832) 0:11:41.731 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:274 Monday 20 April 2026 15:56:39 -0400 (0:00:01.403) 0:11:43.134 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:56:39 -0400 (0:00:00.680) 0:11:43.814 ********** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:56:40 -0400 (0:00:00.348) 0:11:44.163 ********** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:56:40 -0400 (0:00:00.269) 0:11:44.432 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "895d0975-61ca-4e1b-8227-872de03b853f" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:56:41 -0400 (0:00:01.481) 0:11:45.914 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002591", "end": "2026-04-20 15:56:43.199810", "rc": 0, "start": "2026-04-20 15:56:43.197219" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=895d0975-61ca-4e1b-8227-872de03b853f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:56:43 -0400 (0:00:01.520) 0:11:47.435 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002945", "end": "2026-04-20 15:56:44.785142", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:56:44.782197" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:56:45 -0400 (0:00:01.578) 0:11:49.013 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 15:56:45 -0400 (0:00:00.371) 0:11:49.384 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 15:56:45 -0400 (0:00:00.269) 0:11:49.654 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 15:56:46 -0400 (0:00:00.329) 0:11:49.983 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 15:56:46 -0400 (0:00:00.293) 0:11:50.277 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 15:56:46 -0400 (0:00:00.521) 0:11:50.800 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 15:56:47 -0400 (0:00:00.404) 0:11:51.204 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 15:56:47 -0400 (0:00:00.288) 0:11:51.493 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 15:56:47 -0400 (0:00:00.313) 0:11:51.806 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 15:56:48 -0400 (0:00:00.315) 0:11:52.122 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 15:56:48 -0400 (0:00:00.331) 0:11:52.453 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 15:56:48 -0400 (0:00:00.287) 0:11:52.741 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 15:56:49 -0400 (0:00:00.328) 0:11:53.070 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 15:56:49 -0400 (0:00:00.255) 0:11:53.325 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 15:56:49 -0400 
(0:00:00.129) 0:11:53.455 ********** ok: [managed-node3] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.113 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 15:56:51 -0400 (0:00:01.495) 0:11:54.951 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 15:56:51 -0400 (0:00:00.124) 0:11:55.075 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 15:56:51 -0400 (0:00:00.459) 0:11:55.535 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 15:56:51 -0400 (0:00:00.292) 0:11:55.827 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 15:56:52 -0400 (0:00:00.213) 0:11:56.041 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 15:56:52 -0400 (0:00:00.196) 0:11:56.238 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 15:56:52 -0400 (0:00:00.213) 0:11:56.451 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 15:56:53 -0400 (0:00:00.697) 0:11:57.148 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 15:56:53 -0400 (0:00:00.251) 0:11:57.400 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 15:56:53 -0400 (0:00:00.159) 0:11:57.560 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 15:56:53 -0400 (0:00:00.292) 0:11:57.852 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 15:56:54 -0400 (0:00:00.266) 0:11:58.119 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 15:56:54 -0400 (0:00:00.175) 0:11:58.294 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 15:56:54 -0400 (0:00:00.179) 0:11:58.474 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 15:56:55 -0400 (0:00:00.473) 0:11:58.948 ********** skipping: [managed-node3] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=895d0975-61ca-4e1b-8227-872de03b853f', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": 
"/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 15:56:55 -0400 (0:00:00.325) 0:11:59.273 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 15:56:55 -0400 (0:00:00.506) 0:11:59.779 ********** skipping: [managed-node3] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=895d0975-61ca-4e1b-8227-872de03b853f', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 15:56:55 -0400 (0:00:00.157) 0:11:59.937 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 15:56:56 -0400 (0:00:00.425) 0:12:00.362 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 15:56:56 -0400 (0:00:00.211) 0:12:00.574 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 15:56:56 -0400 (0:00:00.142) 0:12:00.716 ********** TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 15:56:56 -0400 (0:00:00.199) 0:12:00.916 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 15:56:57 -0400 (0:00:00.174) 0:12:01.090 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 15:56:57 -0400 (0:00:00.627) 0:12:01.718 ********** skipping: [managed-node3] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=895d0975-61ca-4e1b-8227-872de03b853f', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 15:56:58 -0400 (0:00:00.314) 0:12:02.032 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 15:56:58 -0400 (0:00:00.446) 0:12:02.478 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 15:56:58 -0400 (0:00:00.313) 0:12:02.792 ********** skipping: [managed-node3] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 15:56:59 
-0400 (0:00:00.189) 0:12:02.982 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 15:56:59 -0400 (0:00:00.223) 0:12:03.205 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 15:56:59 -0400 (0:00:00.218) 0:12:03.424 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 15:56:59 -0400 (0:00:00.320) 0:12:03.744 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 15:56:59 -0400 (0:00:00.201) 0:12:03.946 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 15:57:00 -0400 (0:00:00.208) 0:12:04.155 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 15:57:00 -0400 (0:00:00.215) 0:12:04.370 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:57:00 -0400 (0:00:00.384) 0:12:04.755 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:57:01 -0400 (0:00:00.316) 0:12:05.071 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for 
managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:57:02 -0400 (0:00:01.347) 0:12:06.418 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:57:02 -0400 (0:00:00.338) 0:12:06.757 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:57:03 -0400 (0:00:00.342) 0:12:07.099 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:57:03 -0400 (0:00:00.451) 0:12:07.551 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:57:03 -0400 (0:00:00.199) 0:12:07.751 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:57:04 -0400 (0:00:00.216) 0:12:07.967 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:57:04 -0400 (0:00:00.281) 0:12:08.249 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:57:04 -0400 (0:00:00.128) 0:12:08.378 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:57:04 -0400 (0:00:00.156) 0:12:08.534 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:57:04 -0400 (0:00:00.154) 0:12:08.689 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:57:04 -0400 (0:00:00.151) 0:12:08.841 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:57:05 -0400 (0:00:00.235) 0:12:09.076 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=895d0975-61ca-4e1b-8227-872de03b853f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:57:05 -0400 (0:00:00.472) 0:12:09.549 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:57:05 -0400 (0:00:00.232) 0:12:09.781 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:57:06 -0400 (0:00:00.334) 0:12:10.116 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:57:06 -0400 (0:00:00.322) 0:12:10.438 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:57:06 -0400 (0:00:00.309) 0:12:10.748 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:57:06 -0400 (0:00:00.168) 0:12:10.916 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:57:07 -0400 (0:00:00.402) 0:12:11.319 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:57:07 -0400 (0:00:00.353) 0:12:11.672 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776714977.3421144, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776714977.3421144, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 251429, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776714977.3421144, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:57:09 -0400 (0:00:01.901) 0:12:13.574 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:57:09 -0400 (0:00:00.328) 0:12:13.903 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:57:10 -0400 (0:00:00.198) 0:12:14.101 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:57:10 -0400 (0:00:00.288) 0:12:14.390 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:57:10 -0400 (0:00:00.382) 0:12:14.773 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:57:11 -0400 (0:00:00.204) 0:12:14.977 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:57:11 -0400 (0:00:00.244) 0:12:15.222 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 15:57:11 -0400 (0:00:00.259) 0:12:15.481 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 15:57:15 -0400 (0:00:04.422) 0:12:19.903 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 15:57:16 -0400 (0:00:00.252) 0:12:20.156 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 15:57:16 -0400 (0:00:00.323) 0:12:20.479 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 15:57:16 -0400 (0:00:00.283) 0:12:20.763 ********** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 15:57:17 -0400 (0:00:00.201) 0:12:20.964 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 15:57:17 -0400 (0:00:00.152) 0:12:21.117 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 15:57:17 -0400 (0:00:00.168) 0:12:21.285 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 15:57:17 -0400 (0:00:00.143) 0:12:21.429 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 15:57:17 -0400 (0:00:00.235) 0:12:21.664 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 15:57:17 -0400 (0:00:00.254) 0:12:21.919 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 15:57:18 -0400 (0:00:00.276) 0:12:22.196 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 15:57:18 -0400 (0:00:00.275) 0:12:22.471 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 15:57:18 -0400 (0:00:00.280) 0:12:22.751 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 15:57:19 -0400 (0:00:00.330) 0:12:23.082 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 15:57:19 -0400 (0:00:00.249) 0:12:23.331 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 15:57:19 -0400 (0:00:00.216) 0:12:23.548 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 15:57:19 -0400 (0:00:00.142) 0:12:23.690 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 15:57:19 -0400 (0:00:00.179) 0:12:23.870 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 15:57:20 -0400 (0:00:00.137) 0:12:24.008 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 15:57:20 -0400 (0:00:00.186) 0:12:24.194 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 15:57:20 -0400 (0:00:00.194) 0:12:24.389 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 15:57:20 -0400 (0:00:00.203) 0:12:24.593 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 15:57:20 -0400 (0:00:00.243) 0:12:24.836 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 15:57:21 -0400 (0:00:00.354) 0:12:25.191 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 15:57:21 -0400 (0:00:00.329) 0:12:25.521 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 15:57:21 -0400 (0:00:00.305) 0:12:25.826 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 15:57:22 -0400 (0:00:00.283) 0:12:26.110 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 15:57:22 -0400 (0:00:00.308) 0:12:26.419 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 15:57:22 -0400 (0:00:00.360) 0:12:26.780 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 15:57:23 -0400 (0:00:00.334) 0:12:27.114 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 15:57:23 -0400 (0:00:00.314) 0:12:27.429 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 15:57:23 -0400 (0:00:00.344) 0:12:27.773 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] 
***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 15:57:24 -0400 (0:00:00.362) 0:12:28.136 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 15:57:24 -0400 (0:00:00.281) 0:12:28.417 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 15:57:24 -0400 (0:00:00.269) 0:12:28.687 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 15:57:25 -0400 (0:00:00.309) 0:12:28.996 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 15:57:25 -0400 (0:00:00.381) 0:12:29.377 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 15:57:25 -0400 (0:00:00.302) 0:12:29.680 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 15:57:26 -0400 (0:00:00.282) 0:12:29.963 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 15:57:26 -0400 (0:00:00.238) 0:12:30.201 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 15:57:26 -0400 (0:00:00.243) 0:12:30.445 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 15:57:26 -0400 (0:00:00.302) 0:12:30.747 ********** skipping: [managed-node3] => {} TASK [Show test 
volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 15:57:27 -0400 (0:00:00.298) 0:12:31.046 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 15:57:27 -0400 (0:00:00.214) 0:12:31.260 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 15:57:27 -0400 (0:00:00.279) 0:12:31.539 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 15:57:27 -0400 (0:00:00.259) 0:12:31.799 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 15:57:28 -0400 (0:00:00.224) 0:12:32.024 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 15:57:28 -0400 (0:00:00.754) 0:12:32.778 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 15:57:29 -0400 (0:00:00.295) 0:12:33.074 ********** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 15:57:29 -0400 (0:00:00.245) 0:12:33.319 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 15:57:29 -0400 (0:00:00.298) 0:12:33.617 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 15:57:29 -0400 (0:00:00.274) 0:12:33.892 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 15:57:30 -0400 (0:00:00.208) 0:12:34.100 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 15:57:30 -0400 (0:00:00.241) 0:12:34.342 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 15:57:30 -0400 (0:00:00.341) 0:12:34.684 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 15:57:30 -0400 (0:00:00.211) 0:12:34.896 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 15:57:31 -0400 (0:00:00.385) 0:12:35.281 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 15:57:31 -0400 (0:00:00.326) 0:12:35.607 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 15:57:31 -0400 (0:00:00.187) 0:12:35.794 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 15:57:32 -0400 (0:00:00.167) 0:12:35.962 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 15:57:32 -0400 (0:00:00.257) 0:12:36.219 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": 
null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 15:57:32 -0400 (0:00:00.237) 0:12:36.457 ********** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Monday 20 April 2026 15:57:33 -0400 (0:00:01.404) 0:12:37.862 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 15:57:34 -0400 (0:00:00.667) 0:12:38.529 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 15:57:34 -0400 (0:00:00.226) 0:12:38.755 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:57:35 -0400 (0:00:00.396) 0:12:39.152 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:57:35 -0400 (0:00:00.303) 0:12:39.456 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:57:35 -0400 (0:00:00.228) 0:12:39.685 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:57:36 -0400 (0:00:00.349) 0:12:40.034 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:57:38 -0400 (0:00:02.408) 0:12:42.442 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific 
variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:57:39 -0400 (0:00:01.294) 0:12:43.736 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:57:40 -0400 (0:00:00.326) 0:12:44.063 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:57:40 -0400 (0:00:00.131) 0:12:44.194 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:57:40 -0400 (0:00:00.171) 0:12:44.366 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:57:40 -0400 (0:00:00.190) 0:12:44.557 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:57:40 -0400 (0:00:00.218) 0:12:44.776 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for 
managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:57:41 -0400 (0:00:00.568) 0:12:45.344 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:57:41 -0400 (0:00:00.423) 0:12:45.768 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:57:42 -0400 (0:00:00.286) 0:12:46.055 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:57:46 -0400 (0:00:04.258) 0:12:50.313 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:57:46 -0400 (0:00:00.322) 0:12:50.636 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:57:47 -0400 (0:00:00.364) 0:12:51.000 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:57:52 -0400 (0:00:05.525) 0:12:56.526 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:57:52 -0400 (0:00:00.396) 0:12:56.923 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:57:53 -0400 (0:00:00.178) 0:12:57.101 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:57:53 -0400 (0:00:00.293) 0:12:57.395 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:57:53 -0400 (0:00:00.193) 0:12:57.588 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:57:57 -0400 (0:00:04.253) 0:13:01.841 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service": { "name": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service": { "name": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:58:00 -0400 (0:00:02.889) 0:13:04.730 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d36889b12\x2df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override 
cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-36889b12-f60f-49c7-ac51-807f73989f4a", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-36889b12-f60f-49c7-ac51-807f73989f4a /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-36889b12-f60f-49c7-ac51-807f73989f4a ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", 
"LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 15:55:34 EDT", "StateChangeTimestampMonotonic": "7296057805", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": 
"[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:58:04 -0400 (0:00:03.519) 0:13:08.250 ********** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 15:58:10 -0400 (0:00:05.983) 0:13:14.234 ********** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:58:10 -0400 (0:00:00.302) 0:13:14.536 ********** changed: [managed-node3] => 
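The fatal result above is the outcome this test expects: the partition already carries a filesystem (earlier steps created and mounted one at /opt/test1, and the volume arguments show fs_type 'xfs'), and the module was invoked with safe_mode: true, so it refuses the destructive step of replacing that formatting with a LUKS layer and reports "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"; the following task then unmasks the systemd cryptsetup units that were masked before the attempt. The sketch below shows how a caller would deliberately opt in to the destructive change; the role-level variable name storage_safe_mode is an assumption here (only the module argument safe_mode appears in this log), so treat it as illustrative.

    # Sketch only: deliberately allow existing formatting to be replaced.
    # storage_safe_mode is assumed to be the role switch behind the module's
    # safe_mode argument visible in the failure payload above.
    - name: Re-run the role with safe mode disabled (sketch)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_password: yabbadabbadoo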
(item=systemd-cryptsetup@luks\x2d36889b12\x2df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d36889b12\\x2df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...df60f\x2d49c7\x2dac51\x2d807f73989f4a.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "name": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...df60f\\x2d49c7\\x2dac51\\x2d807f73989f4a.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 15:58:14 -0400 (0:00:03.974) 0:13:18.511 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 15:58:14 -0400 (0:00:00.356) 0:13:18.868 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Monday 20 April 2026 15:58:15 -0400 (0:00:00.487) 0:13:19.355 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 15:58:15 -0400 (0:00:00.254) 0:13:19.610 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715053.687982, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715053.687982, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776715053.687982, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3528016031", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 15:58:17 -0400 (0:00:01.565) 0:13:21.175 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:306 Monday 20 April 2026 15:58:17 -0400 (0:00:00.247) 0:13:21.423 ********** ok: [managed-node3] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testnoe9lpp6lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:313 Monday 20 April 2026 15:58:20 -0400 (0:00:03.138) 0:13:24.561 ********** ok: [managed-node3] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testnoe9lpp6lukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1776715100.945285-224495-46739951957049/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:320 Monday 20 April 2026 15:58:24 -0400 (0:00:04.211) 0:13:28.773 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 15:58:25 -0400 (0:00:00.259) 
0:13:29.032 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 15:58:25 -0400 (0:00:00.231) 0:13:29.264 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 15:58:25 -0400 (0:00:00.288) 0:13:29.552 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 15:58:25 -0400 (0:00:00.161) 0:13:29.714 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 15:58:28 -0400 (0:00:02.365) 0:13:32.080 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 15:58:29 -0400 (0:00:01.510) 0:13:33.590 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 15:58:30 -0400 (0:00:00.488) 0:13:34.079 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 15:58:30 -0400 (0:00:00.272) 0:13:34.351 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 15:58:30 -0400 (0:00:00.198) 0:13:34.549 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 15:58:30 -0400 (0:00:00.214) 0:13:34.764 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 15:58:31 -0400 (0:00:00.189) 0:13:34.953 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 15:58:31 -0400 (0:00:00.455) 0:13:35.408 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 15:58:31 -0400 (0:00:00.173) 0:13:35.582 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 15:58:31 -0400 (0:00:00.208) 0:13:35.791 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 15:58:36 -0400 (0:00:04.480) 0:13:40.271 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 15:58:36 -0400 (0:00:00.278) 0:13:40.549 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required 
packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 15:58:36 -0400 (0:00:00.202) 0:13:40.752 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 15:58:42 -0400 (0:00:05.734) 0:13:46.487 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 15:58:42 -0400 (0:00:00.416) 0:13:46.903 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 15:58:43 -0400 (0:00:00.174) 0:13:47.077 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 15:58:43 -0400 (0:00:00.263) 0:13:47.341 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 15:58:43 -0400 (0:00:00.208) 0:13:47.550 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 15:58:48 -0400 (0:00:04.472) 0:13:52.023 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": 
"systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 15:58:51 -0400 (0:00:03.139) 0:13:55.162 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 15:58:51 -0400 (0:00:00.404) 0:13:55.566 ********** changed: [managed-node3] => { 
"actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-46614478-6da8-40f1-a29a-98a8b327ab04", "password": "/tmp/storage_testnoe9lpp6lukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 15:59:05 -0400 (0:00:14.046) 0:14:09.612 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 15:59:06 -0400 (0:00:00.345) 0:14:09.958 ********** ok: 
[managed-node3] => { "changed": false, "stat": { "atime": 1776714990.184092, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "23d652c883995983baed5fb61a5284e67f9c2818", "ctime": 1776714990.1810923, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776714990.1810923, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 15:59:07 -0400 (0:00:01.386) 0:14:11.344 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 15:59:09 -0400 (0:00:01.716) 0:14:13.061 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 15:59:09 -0400 (0:00:00.346) 0:14:13.407 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-46614478-6da8-40f1-a29a-98a8b327ab04", "password": "/tmp/storage_testnoe9lpp6lukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 15:59:09 -0400 (0:00:00.340) 0:14:13.748 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 15:59:10 -0400 (0:00:00.391) 0:14:14.139 ********** ok: [managed-node3] => 
{ "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 15:59:10 -0400 (0:00:00.262) 0:14:14.402 ********** changed: [managed-node3] => (item={'src': 'UUID=895d0975-61ca-4e1b-8227-872de03b853f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=895d0975-61ca-4e1b-8227-872de03b853f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 15:59:12 -0400 (0:00:01.879) 0:14:16.281 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 15:59:14 -0400 (0:00:02.218) 0:14:18.500 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 15:59:16 -0400 (0:00:01.862) 0:14:20.363 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 15:59:16 -0400 (0:00:00.319) 0:14:20.682 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 15:59:19 -0400 (0:00:02.335) 0:14:23.018 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715004.784067, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776714995.7070825, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 85983428, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776714995.7060826, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "4034672310", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 15:59:20 -0400 (0:00:01.613) 0:14:24.632 ********** changed: [managed-node3] => (item={'backing_device': '/dev/sda1', 'name': 'luks-46614478-6da8-40f1-a29a-98a8b327ab04', 'password': '/tmp/storage_testnoe9lpp6lukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-46614478-6da8-40f1-a29a-98a8b327ab04", "password": "/tmp/storage_testnoe9lpp6lukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 15:59:22 -0400 (0:00:01.740) 0:14:26.372 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 15:59:24 -0400 (0:00:01.676) 0:14:28.049 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:336 Monday 20 April 2026 15:59:25 -0400 (0:00:01.205) 0:14:29.255 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 15:59:25 -0400 (0:00:00.229) 0:14:29.484 ********** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": 
false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 15:59:25 -0400 (0:00:00.239) 0:14:29.723 ********** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 15:59:26 -0400 (0:00:00.259) 0:14:29.983 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "size": "4G", "type": "crypt", "uuid": "af0226a6-545f-4c73-9f5e-45934a26bf98" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "46614478-6da8-40f1-a29a-98a8b327ab04" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 15:59:27 -0400 (0:00:01.743) 0:14:31.726 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003081", "end": "2026-04-20 15:59:29.245003", "rc": 0, "start": "2026-04-20 15:59:29.241922" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 15:59:29 -0400 (0:00:01.760) 0:14:33.487 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002966", "end": "2026-04-20 15:59:30.985151", "failed_when_result": false, "rc": 0, "start": "2026-04-20 15:59:30.982185" }
STDOUT:
luks-46614478-6da8-40f1-a29a-98a8b327ab04 /dev/sda1 /tmp/storage_testnoe9lpp6lukskey
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 15:59:31 -0400 (0:00:01.730) 0:14:35.217 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 15:59:31 -0400 (0:00:00.424) 0:14:35.642 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 15:59:31 -0400 (0:00:00.300) 0:14:35.942 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 15:59:32 -0400 (0:00:00.384) 0:14:36.327 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 15:59:32 -0400 (0:00:00.343) 0:14:36.670 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 15:59:33 -0400 (0:00:00.690) 0:14:37.360 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 15:59:33 -0400 (0:00:00.316) 0:14:37.677 ********** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 15:59:33 -0400 (0:00:00.239) 0:14:37.917 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 15:59:34 -0400 (0:00:00.244) 0:14:38.161 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 15:59:34 -0400 (0:00:00.191) 0:14:38.352 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 15:59:34 -0400 (0:00:00.184) 0:14:38.537 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 
April 2026 15:59:34 -0400 (0:00:00.260) 0:14:38.798 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 15:59:35 -0400 (0:00:00.336) 0:14:39.134 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 15:59:35 -0400 (0:00:00.303) 0:14:39.438 ********** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 15:59:35 -0400 (0:00:00.217) 0:14:39.656 ********** ok: [managed-node3] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.113 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 15:59:37 -0400 (0:00:01.724) 0:14:41.380 ********** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 15:59:37 -0400 (0:00:00.281) 0:14:41.661 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 15:59:38 -0400 (0:00:00.488) 0:14:42.150 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 15:59:38 -0400 (0:00:00.255) 0:14:42.406 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 15:59:38 -0400 (0:00:00.219) 0:14:42.625 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 15:59:38 -0400 (0:00:00.271) 0:14:42.897 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 15:59:39 -0400 (0:00:00.345) 0:14:43.243 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 15:59:39 -0400 (0:00:00.299) 0:14:43.542 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 15:59:39 -0400 (0:00:00.312) 0:14:43.855 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 15:59:40 -0400 (0:00:00.272) 0:14:44.128 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 15:59:40 -0400 (0:00:00.264) 0:14:44.392 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 15:59:40 -0400 (0:00:00.324) 0:14:44.716 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 15:59:41 -0400 (0:00:00.299) 0:14:45.016 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 15:59:41 -0400 (0:00:00.318) 0:14:45.334 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 15:59:42 -0400 (0:00:00.627) 0:14:45.962 ********** skipping: [managed-node3] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testnoe9lpp6lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 
'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 15:59:42 -0400 (0:00:00.400) 0:14:46.363 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3
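The per-volume validation steps in this block are loops over the pool's volumes, with the whole item echoed back when a check does not apply; the thinpool task that follows is skipped for this volume (thin: False). A minimal, assumed sketch of that pattern (the included file name, the storage_test_pool variable and the when condition are illustrative, not copied from the role's test code; only the loop_var name appears in this log):

- name: Validate pool member thinpool settings
  ansible.builtin.include_tasks: verify-pool-volume-thin.yml   # illustrative file name
  loop: "{{ storage_test_pool.volumes }}"                      # assumed source list
  loop_control:
    loop_var: storage_test_thin_volume
  when: storage_test_thin_volume.thin | bool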
TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 15:59:42 -0400 (0:00:00.546) 0:14:46.909 ********** skipping: [managed-node3] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testnoe9lpp6lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 15:59:43 -0400 (0:00:00.174) 0:14:47.084 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 15:59:43 -0400 (0:00:00.478) 0:14:47.562 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 15:59:44 -0400 (0:00:01.068) 0:14:48.630 ********** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 15:59:44 -0400 (0:00:00.094) 0:14:48.725 ********** TASK [Clear test variables] 
**************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 15:59:44 -0400 (0:00:00.213) 0:14:48.939 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 15:59:45 -0400 (0:00:00.180) 0:14:49.119 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 15:59:45 -0400 (0:00:00.685) 0:14:49.804 ********** skipping: [managed-node3] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testnoe9lpp6lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testnoe9lpp6lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": 
"present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 15:59:46 -0400 (0:00:00.435) 0:14:50.240 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 15:59:46 -0400 (0:00:00.379) 0:14:50.620 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 15:59:46 -0400 (0:00:00.239) 0:14:50.860 ********** skipping: [managed-node3] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 15:59:47 -0400 (0:00:00.249) 0:14:51.110 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 15:59:47 -0400 (0:00:00.345) 0:14:51.455 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 15:59:47 -0400 (0:00:00.299) 0:14:51.755 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 15:59:48 -0400 (0:00:00.302) 0:14:52.058 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 15:59:48 -0400 (0:00:00.325) 0:14:52.383 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 15:59:48 -0400 (0:00:00.266) 0:14:52.650 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, 
"changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 15:59:49 -0400 (0:00:00.347) 0:14:52.997 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 15:59:49 -0400 (0:00:00.436) 0:14:53.433 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 15:59:49 -0400 (0:00:00.303) 0:14:53.736 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 15:59:50 -0400 (0:00:01.174) 0:14:54.911 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 15:59:51 -0400 (0:00:00.300) 0:14:55.211 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 15:59:51 -0400 (0:00:00.219) 0:14:55.431 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount 
state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 15:59:51 -0400 (0:00:00.309) 0:14:55.740 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 15:59:51 -0400 (0:00:00.178) 0:14:55.918 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 15:59:52 -0400 (0:00:00.207) 0:14:56.125 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 15:59:52 -0400 (0:00:00.181) 0:14:56.306 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 15:59:52 -0400 (0:00:00.214) 0:14:56.521 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 15:59:52 -0400 (0:00:00.134) 0:14:56.656 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 15:59:52 -0400 (0:00:00.181) 0:14:56.838 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 15:59:53 -0400 (0:00:00.276) 0:14:57.114 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 15:59:53 -0400 (0:00:00.293) 0:14:57.408 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", 
"storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 15:59:54 -0400 (0:00:00.609) 0:14:58.017 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 15:59:54 -0400 (0:00:00.287) 0:14:58.305 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 15:59:54 -0400 (0:00:00.287) 0:14:58.593 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 15:59:54 -0400 (0:00:00.220) 0:14:58.813 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 15:59:55 -0400 (0:00:00.341) 0:14:59.155 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 15:59:55 -0400 (0:00:00.273) 0:14:59.428 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 15:59:55 -0400 (0:00:00.292) 0:14:59.720 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 15:59:56 -0400 (0:00:00.465) 0:15:00.186 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715145.1648235, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715145.1648235, "dev": 
6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 251429, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776715145.1648235, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 15:59:58 -0400 (0:00:01.823) 0:15:02.010 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 15:59:58 -0400 (0:00:00.353) 0:15:02.363 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 15:59:58 -0400 (0:00:00.177) 0:15:02.541 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 15:59:58 -0400 (0:00:00.227) 0:15:02.769 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 15:59:59 -0400 (0:00:00.229) 0:15:02.999 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 15:59:59 -0400 (0:00:00.304) 0:15:03.303 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 15:59:59 -0400 (0:00:00.120) 0:15:03.424 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715145.3178234, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715145.3178234, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 268226, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715145.3178234, "nlink": 1, "path": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:00:01 -0400 (0:00:01.727) 0:15:05.151 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:00:05 -0400 (0:00:04.057) 0:15:09.208 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010099", "end": "2026-04-20 16:00:06.256071", "rc": 0, "start": "2026-04-20 16:00:06.245972" }
STDOUT:
LUKS header information
Version: 2
Epoch: 3
Metadata area: 16384 [bytes]
Keyslots area: 16744448 [bytes]
UUID: 46614478-6da8-40f1-a29a-98a8b327ab04
Label: (no label)
Subsystem: (no subsystem)
Flags: (no flags)
Data segments:
  0: crypt
    offset: 16777216 [bytes]
    length: (whole device)
    cipher: aes-xts-plain64
    sector: 512 [bytes]
Keyslots:
  0: luks2
    Key: 512 bits
    Priority: normal
    Cipher: aes-xts-plain64
    Cipher key: 512 bits
    PBKDF: argon2i
    Time cost: 4
    Memory: 924187
    Threads: 2
    Salt: ac 50 57 48 a4 90 08 f6 7f 9d ab ef 30 f0 3b 4f 47 e8 b7 df af 93 d1 11 b8 3f 38 08 5b 40 a7 ef
    AF stripes: 4000
    AF hash: sha256
    Area offset:32768 [bytes]
    Area length:258048 [bytes]
    Digest ID: 0
Tokens:
Digests:
  0: pbkdf2
    Hash: sha256
    Iterations: 120249
    Salt: e7 fe 7c aa e8 c7 c1 cd 18 63 26 ce 9e 99 1c 33 11 c6 56 ee 5c fc 43 ef e3 72 ba 2f 38 7f ed 15
    Digest: c8 5c 8a b6 a8 00 5f 06 7f 6c 58 a2 24 db 32 28 48 55 fa 18 3b 88 e0 d2 29 1f 34 8c 93 f1 4c 92
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:00:06 -0400 (0:00:01.162) 0:15:10.371 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:00:06 -0400 (0:00:00.302) 0:15:10.674 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:00:06 -0400 (0:00:00.268) 0:15:10.942 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:00:07 -0400 (0:00:00.207) 0:15:11.149 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:00:07 -0400 (0:00:00.237) 0:15:11.387 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:00:07 -0400 (0:00:00.284) 0:15:11.672 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:00:07 -0400 (0:00:00.211) 0:15:11.883 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:00:08 -0400 (0:00:00.212) 0:15:12.095 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-46614478-6da8-40f1-a29a-98a8b327ab04 /dev/sda1 /tmp/storage_testnoe9lpp6lukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testnoe9lpp6lukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:00:08 -0400 (0:00:00.289) 0:15:12.384 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:00:08 -0400 (0:00:00.306) 0:15:12.691 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:00:09 -0400 (0:00:00.386) 0:15:13.077 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:00:09 -0400 (0:00:00.404) 0:15:13.481 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:00:10 -0400 (0:00:00.468) 0:15:13.949 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, 
"_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:00:10 -0400 (0:00:00.265) 0:15:14.215 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:00:10 -0400 (0:00:00.195) 0:15:14.411 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:00:10 -0400 (0:00:00.293) 0:15:14.705 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:00:11 -0400 (0:00:00.262) 0:15:14.968 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:00:11 -0400 (0:00:00.306) 0:15:15.274 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:00:11 -0400 (0:00:00.171) 0:15:15.445 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:00:11 -0400 (0:00:00.220) 0:15:15.666 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:00:11 -0400 (0:00:00.252) 0:15:15.918 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:00:12 -0400 (0:00:00.256) 0:15:16.175 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task 
path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:00:12 -0400 (0:00:00.290) 0:15:16.465 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:00:12 -0400 (0:00:00.297) 0:15:16.763 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:00:13 -0400 (0:00:00.195) 0:15:16.959 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:00:13 -0400 (0:00:00.302) 0:15:17.261 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:00:13 -0400 (0:00:00.305) 0:15:17.567 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:00:13 -0400 (0:00:00.284) 0:15:17.851 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:00:14 -0400 (0:00:00.216) 0:15:18.068 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:00:14 -0400 (0:00:00.301) 0:15:18.369 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:00:14 -0400 (0:00:00.202) 0:15:18.572 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:00:14 -0400 (0:00:00.231) 0:15:18.804 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:00:15 -0400 (0:00:00.187) 0:15:18.991 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:00:15 -0400 (0:00:00.229) 0:15:19.221 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:00:15 -0400 (0:00:00.112) 0:15:19.333 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:00:15 -0400 (0:00:00.262) 0:15:19.596 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:00:15 -0400 (0:00:00.190) 0:15:19.786 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:00:16 -0400 (0:00:00.283) 0:15:20.069 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:00:16 -0400 (0:00:00.270) 0:15:20.340 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:00:16 -0400 (0:00:00.344) 0:15:20.684 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:00:16 -0400 (0:00:00.208) 0:15:20.893 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:00:17 -0400 (0:00:00.280) 0:15:21.173 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool 
size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:00:17 -0400 (0:00:00.228) 0:15:21.401 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:00:17 -0400 (0:00:00.144) 0:15:21.546 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:00:17 -0400 (0:00:00.283) 0:15:21.830 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:00:18 -0400 (0:00:00.270) 0:15:22.100 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:00:18 -0400 (0:00:00.271) 0:15:22.371 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:00:18 -0400 (0:00:00.250) 0:15:22.622 ********** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:00:18 -0400 (0:00:00.278) 0:15:22.900 ********** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:00:19 -0400 (0:00:00.170) 0:15:23.071 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:00:19 -0400 (0:00:00.176) 0:15:23.247 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:00:19 -0400 (0:00:00.187) 0:15:23.435 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:00:19 -0400 (0:00:00.172) 0:15:23.607 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:00:19 -0400 (0:00:00.197) 0:15:23.804 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:00:19 -0400 (0:00:00.141) 0:15:23.946 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:00:20 -0400 (0:00:00.215) 0:15:24.162 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:00:20 -0400 (0:00:00.184) 0:15:24.347 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:00:20 -0400 (0:00:00.208) 0:15:24.555 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:00:20 -0400 (0:00:00.164) 0:15:24.720 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:00:20 -0400 (0:00:00.131) 0:15:24.852 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:342 Monday 20 April 2026 16:00:21 -0400 (0:00:00.197) 0:15:25.050 ********** ok: [managed-node3] => { "changed": false, "path": 
"/tmp/storage_testnoe9lpp6lukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:352 Monday 20 April 2026 16:00:22 -0400 (0:00:01.342) 0:15:26.392 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 16:00:22 -0400 (0:00:00.181) 0:15:26.574 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 16:00:22 -0400 (0:00:00.226) 0:15:26.800 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:00:23 -0400 (0:00:00.233) 0:15:27.034 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:00:23 -0400 (0:00:00.189) 0:15:27.223 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:00:23 -0400 (0:00:00.213) 0:15:27.436 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:00:23 -0400 (0:00:00.229) 0:15:27.666 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:00:25 -0400 (0:00:02.188) 0:15:29.854 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:00:28 -0400 (0:00:02.574) 0:15:32.428 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", 
"skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:00:28 -0400 (0:00:00.420) 0:15:32.850 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:00:29 -0400 (0:00:00.221) 0:15:33.071 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:00:29 -0400 (0:00:00.221) 0:15:33.293 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:00:29 -0400 (0:00:00.201) 0:15:33.494 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:00:29 -0400 (0:00:00.145) 0:15:33.639 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:00:30 -0400 (0:00:00.448) 0:15:34.087 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:00:30 -0400 (0:00:00.218) 0:15:34.306 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:00:30 -0400 (0:00:00.241) 0:15:34.548 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:00:35 -0400 (0:00:04.670) 0:15:39.219 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:00:35 -0400 (0:00:00.200) 0:15:39.420 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:00:35 -0400 (0:00:00.212) 0:15:39.632 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:00:41 -0400 (0:00:05.577) 0:15:45.210 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:00:41 -0400 (0:00:00.234) 0:15:45.444 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:00:41 -0400 (0:00:00.202) 0:15:45.647 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:00:41 -0400 (0:00:00.240) 0:15:45.887 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:00:42 -0400 (0:00:00.216) 0:15:46.103 ********** ok: [managed-node3] => { "changed": false, "rc": 
0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:00:46 -0400 (0:00:04.441) 0:15:50.545 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": 
"dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:00:49 -0400 (0:00:02.924) 0:15:53.469 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:00:49 -0400 (0:00:00.431) 0:15:53.900 ********** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 16:00:55 -0400 (0:00:05.454) 0:15:59.355 ********** fatal: [managed-node3]: FAILED! 
=> { "changed": false }
MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
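The failure above is the expected result of this negative test: volume 'test1' has encryption enabled, but both encryption_key and encryption_password are None in the module invocation, so the blivet provider aborts with "encrypted volume 'test1' missing key/password" before performing any actions. For reference, a minimal sketch of an invocation that would satisfy this check, using the same storage_pools fields shown in the invocation dump above; the play itself and the choice of password are illustrative only, not part of the test suite (the password value is the one the next role run passes in):

# Illustrative play, not part of the test suite.
- hosts: managed-node3
  vars:
    storage_pools:
      - name: foo
        type: lvm
        disks: [sda]
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: true
            # either a passphrase or a key file satisfies the missing key/password check
            encryption_password: yabbadabbadoo
            # encryption_key: /path/to/keyfile
  roles:
    - fedora.linux_system_roles.storage

The next role run (the "Show storage_pools" task at 16:01:08 below) passes the same pool with encryption_password, encryption_cipher, encryption_key_size, and encryption_luks_version set.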
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:00:55 -0400 (0:00:00.242) 0:15:59.598 ********** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 16:00:56 -0400 (0:00:00.460) 0:16:00.058 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 16:00:56 -0400 (0:00:00.156) 0:16:00.214 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 16:00:56 -0400 (0:00:00.221) 0:16:00.436 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:370 Monday 20 April 2026 16:00:56 -0400 (0:00:00.185) 0:16:00.622 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:00:56 -0400 (0:00:00.134) 0:16:00.756 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:00:56 -0400 (0:00:00.190) 0:16:00.947 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:00:57 -0400 (0:00:00.177) 0:16:01.124 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:00:57 -0400 (0:00:00.122) 0:16:01.246 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:00:59 -0400 (0:00:01.979) 0:16:03.226 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:01:00 -0400 (0:00:01.510) 0:16:04.737 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => {
"ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:01:01 -0400 (0:00:00.461) 0:16:05.199 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:01:01 -0400 (0:00:00.271) 0:16:05.470 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:01:01 -0400 (0:00:00.278) 0:16:05.749 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:01:02 -0400 (0:00:00.221) 0:16:05.970 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:01:02 -0400 (0:00:00.266) 0:16:06.236 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:01:02 -0400 (0:00:00.618) 0:16:06.854 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : 
Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:01:03 -0400 (0:00:00.296) 0:16:07.150 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:01:03 -0400 (0:00:00.273) 0:16:07.424 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:01:08 -0400 (0:00:04.732) 0:16:12.156 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:01:08 -0400 (0:00:00.404) 0:16:12.561 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:01:08 -0400 (0:00:00.327) 0:16:12.889 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:01:14 -0400 (0:00:05.511) 0:16:18.401 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:01:14 -0400 (0:00:00.391) 0:16:18.792 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:01:14 -0400 (0:00:00.135) 0:16:18.928 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:01:15 -0400 (0:00:00.197) 0:16:19.125 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:01:15 -0400 (0:00:00.133) 0:16:19.259 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:01:19 -0400 (0:00:04.286) 0:16:23.546 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:01:22 -0400 (0:00:03.161) 0:16:26.707 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:01:23 -0400 (0:00:00.436) 0:16:27.143 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-46614478-6da8-40f1-a29a-98a8b327ab04", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", 
"state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:01:35 -0400 (0:00:12.395) 0:16:39.539 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:01:35 -0400 (0:00:00.193) 0:16:39.732 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715156.1858044, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6e03b444caab2fdde4d326944748c71ac51338d3", "ctime": 1776715156.1828046, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715156.1828046, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:01:37 -0400 (0:00:01.266) 0:16:40.998 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:01:38 -0400 (0:00:01.572) 0:16:42.571 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:01:39 -0400 (0:00:00.420) 0:16:42.991 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-46614478-6da8-40f1-a29a-98a8b327ab04", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:01:39 -0400 (0:00:00.375) 0:16:43.367 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:01:39 -0400 (0:00:00.261) 0:16:43.629 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:01:39 -0400 (0:00:00.211) 0:16:43.840 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-46614478-6da8-40f1-a29a-98a8b327ab04" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:01:41 -0400 (0:00:01.481) 0:16:45.321 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:01:43 -0400 (0:00:01.809) 0:16:47.131 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:01:44 -0400 (0:00:01.408) 0:16:48.539 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:01:44 -0400 (0:00:00.279) 0:16:48.819 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:01:46 -0400 (0:00:02.071) 0:16:50.890 ********** ok: 
[managed-node3] => { "changed": false, "stat": { "atime": 1776715170.983779, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "74f1a86f68d3edd89394751ff94ae3ee1d20ef68", "ctime": 1776715162.1507943, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 247464196, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715162.149794, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "3022804676", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:01:48 -0400 (0:00:01.808) 0:16:52.699 ********** changed: [managed-node3] => (item={'backing_device': '/dev/sda1', 'name': 'luks-46614478-6da8-40f1-a29a-98a8b327ab04', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-46614478-6da8-40f1-a29a-98a8b327ab04", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node3] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-54f1b36d-4103-4a30-8816-49506ecb6f36', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:01:51 -0400 (0:00:03.004) 0:16:55.703 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:01:53 -0400 (0:00:02.174) 0:16:57.878 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:388 Monday 20 April 2026 16:01:55 -0400 (0:00:02.028) 0:16:59.907 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:01:56 -0400 (0:00:00.417) 0:17:00.325 ********** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, 
"grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:01:56 -0400 (0:00:00.228) 0:17:00.553 ********** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:01:56 -0400 (0:00:00.326) 0:17:00.880 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "54f1b36d-4103-4a30-8816-49506ecb6f36" }, "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "size": "4G", "type": "crypt", "uuid": "23d59f11-cd37-4e03-be1c-53ab51fca6cf" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:01:58 -0400 (0:00:01.438) 0:17:02.318 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002618", "end": "2026-04-20 16:01:59.644761", "rc": 0, "start": "2026-04-20 16:01:59.642143" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:01:59 -0400 (0:00:01.570) 0:17:03.889 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002672", "end": "2026-04-20 16:02:01.200851", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:02:01.198179" } STDOUT: luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:02:01 -0400 (0:00:01.564) 0:17:05.454 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 16:02:02 -0400 (0:00:00.551) 0:17:06.005 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 16:02:02 -0400 (0:00:00.274) 0:17:06.279 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.023862", "end": "2026-04-20 16:02:03.792808", "rc": 0, "start": "2026-04-20 16:02:03.768946" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 16:02:04 -0400 (0:00:01.688) 0:17:07.967 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 16:02:04 -0400 (0:00:00.393) 0:17:08.360 ********** included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 16:02:04 -0400 (0:00:00.501) 0:17:08.862 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 16:02:05 -0400 (0:00:00.451) 0:17:09.314 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 16:02:09 -0400 (0:00:04.059) 0:17:13.373 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 16:02:09 -0400 (0:00:00.346) 0:17:13.719 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 16:02:10 -0400 (0:00:00.255) 0:17:13.975 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 16:02:10 -0400 (0:00:00.322) 0:17:14.297 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 16:02:10 -0400 (0:00:00.260) 0:17:14.558 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 16:02:10 -0400 (0:00:00.269) 0:17:14.827 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 
Monday 20 April 2026 16:02:11 -0400 (0:00:00.336) 0:17:15.164 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 16:02:11 -0400 (0:00:00.344) 0:17:15.509 ********** ok: [managed-node3] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.113 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 16:02:13 -0400 (0:00:01.737) 0:17:17.266 ********** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 16:02:13 -0400 (0:00:00.316) 0:17:17.583 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 16:02:14 -0400 (0:00:00.388) 0:17:17.971 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 16:02:14 -0400 (0:00:00.370) 0:17:18.342 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 16:02:14 -0400 (0:00:00.243) 0:17:18.585 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 16:02:14 -0400 (0:00:00.289) 0:17:18.875 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 16:02:15 -0400 (0:00:00.208) 0:17:19.083 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 16:02:15 
-0400 (0:00:00.279) 0:17:19.363 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 16:02:15 -0400 (0:00:00.261) 0:17:19.624 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 16:02:15 -0400 (0:00:00.293) 0:17:19.918 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 16:02:16 -0400 (0:00:00.290) 0:17:20.209 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 16:02:16 -0400 (0:00:00.288) 0:17:20.497 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 16:02:16 -0400 (0:00:00.198) 0:17:20.696 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 16:02:16 -0400 (0:00:00.180) 0:17:20.876 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 16:02:17 -0400 (0:00:00.386) 0:17:21.263 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 16:02:17 -0400 (0:00:00.380) 0:17:21.643 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 16:02:18 
-0400 (0:00:00.343) 0:17:21.986 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 16:02:18 -0400 (0:00:00.310) 0:17:22.297 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 16:02:18 -0400 (0:00:00.275) 0:17:22.572 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 16:02:19 -0400 (0:00:00.390) 0:17:22.963 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 16:02:19 -0400 (0:00:00.315) 0:17:23.279 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 16:02:19 -0400 (0:00:00.287) 0:17:23.567 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 16:02:19 -0400 (0:00:00.279) 0:17:23.846 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 16:02:20 -0400 (0:00:00.400) 0:17:24.247 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 16:02:20 -0400 (0:00:00.582) 0:17:24.830 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 16:02:21 -0400 (0:00:00.238) 0:17:25.068 ********** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 16:02:21 -0400 (0:00:00.224) 0:17:25.293 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 16:02:21 -0400 (0:00:00.242) 0:17:25.536 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 16:02:21 -0400 (0:00:00.260) 0:17:25.796 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 16:02:22 -0400 (0:00:00.642) 0:17:26.439 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 16:02:22 -0400 (0:00:00.286) 0:17:26.725 ********** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 16:02:23 -0400 (0:00:00.325) 0:17:27.051 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 16:02:23 -0400 (0:00:00.473) 0:17:27.524 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 16:02:23 -0400 (0:00:00.304) 0:17:27.829 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 
16:02:24 -0400 (0:00:00.409) 0:17:28.239 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 16:02:24 -0400 (0:00:00.234) 0:17:28.473 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 16:02:24 -0400 (0:00:00.237) 0:17:28.711 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 16:02:24 -0400 (0:00:00.213) 0:17:28.924 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 16:02:25 -0400 (0:00:00.213) 0:17:29.137 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 16:02:25 -0400 (0:00:00.243) 0:17:29.381 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 16:02:26 -0400 (0:00:00.591) 0:17:29.972 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 16:02:26 -0400 (0:00:00.466) 0:17:30.438 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 16:02:26 -0400 (0:00:00.297) 0:17:30.736 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 16:02:27 -0400 (0:00:00.310) 
0:17:31.046 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 16:02:27 -0400 (0:00:00.351) 0:17:31.398 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 16:02:27 -0400 (0:00:00.225) 0:17:31.624 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 16:02:27 -0400 (0:00:00.223) 0:17:31.847 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 16:02:28 -0400 (0:00:00.255) 0:17:32.103 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 16:02:28 -0400 (0:00:00.199) 0:17:32.302 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 16:02:28 -0400 (0:00:00.335) 0:17:32.638 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 16:02:28 -0400 (0:00:00.094) 0:17:32.733 ********** skipping: [managed-node3] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 16:02:29 -0400 (0:00:00.292) 0:17:33.025 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 16:02:29 -0400 (0:00:00.261) 0:17:33.287 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 16:02:29 -0400 (0:00:00.233) 0:17:33.520 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 16:02:29 -0400 (0:00:00.251) 0:17:33.772 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 16:02:30 -0400 (0:00:00.241) 0:17:34.014 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 16:02:30 -0400 (0:00:00.263) 0:17:34.277 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 16:02:30 -0400 (0:00:00.174) 0:17:34.452 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:02:30 -0400 (0:00:00.388) 0:17:34.840 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:02:31 -0400 (0:00:00.375) 0:17:35.215 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:02:33 -0400 (0:00:02.101) 0:17:37.316 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:02:33 -0400 (0:00:00.324) 0:17:37.640 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:02:34 -0400 (0:00:00.368) 0:17:38.009 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:02:34 -0400 (0:00:00.426) 0:17:38.435 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:02:34 -0400 (0:00:00.267) 0:17:38.703 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:02:35 -0400 (0:00:00.332) 0:17:39.036 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:02:35 -0400 (0:00:00.292) 0:17:39.329 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:02:35 -0400 (0:00:00.325) 0:17:39.654 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:02:35 -0400 (0:00:00.280) 0:17:39.935 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:02:36 -0400 (0:00:00.249) 0:17:40.184 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:02:36 -0400 (0:00:00.316) 0:17:40.501 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:02:36 -0400 (0:00:00.300) 0:17:40.802 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:02:37 -0400 (0:00:00.468) 0:17:41.270 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:02:37 -0400 (0:00:00.243) 0:17:41.514 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:02:37 -0400 (0:00:00.359) 0:17:41.873 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:02:38 -0400 (0:00:00.276) 0:17:42.149 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:02:38 -0400 (0:00:00.286) 0:17:42.435 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:02:38 -0400 (0:00:00.333) 0:17:42.769 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:02:39 -0400 (0:00:00.422) 0:17:43.191 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:02:39 -0400 (0:00:00.402) 0:17:43.594 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715295.1395638, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715295.1395638, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 284168, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715295.1395638, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:02:41 -0400 (0:00:01.825) 0:17:45.420 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:02:41 -0400 (0:00:00.263) 0:17:45.684 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:02:42 -0400 (0:00:00.265) 0:17:45.949 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:02:42 -0400 (0:00:00.280) 0:17:46.229 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:02:42 -0400 (0:00:00.323) 0:17:46.553 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:02:42 -0400 (0:00:00.185) 0:17:46.738 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:02:43 -0400 (0:00:00.243) 0:17:46.982 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715295.3205633, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715295.3205633, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 284345, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715295.3205633, "nlink": 1, "path": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:02:44 -0400 (0:00:01.781) 0:17:48.763 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:02:49 -0400 (0:00:04.848) 0:17:53.611 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009883", "end": "2026-04-20 16:02:50.907696", "rc": 0, "start": "2026-04-20 16:02:50.897813" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 8c d5 38 46 b6 05 3a b4 fe 19 9c 40 bf a2 51 19 83 0b f4 38 MK salt: 19 7f 8e 61 2a 3f 50 35 f5 b9 af 33 15 50 23 1b 11 d2 91 b0 cb 66 20 7e d2 96 12 4c d7 0e b5 73 MK iterations: 120249 UUID: 54f1b36d-4103-4a30-8816-49506ecb6f36 Key Slot 0: ENABLED Iterations: 1909974 Salt: ab f6 07 ce 40 3f 2a 15 0d c4 87 e3 cb 43 c3 f1 08 f2 b2 cd f1 18 4a 62 f2 89 c6 da fb 5d 1e ce Key 
material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:02:51 -0400 (0:00:01.547) 0:17:55.158 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:02:51 -0400 (0:00:00.436) 0:17:55.595 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:02:51 -0400 (0:00:00.337) 0:17:55.932 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:02:52 -0400 (0:00:00.315) 0:17:56.248 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:02:52 -0400 (0:00:00.286) 0:17:56.534 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:02:52 -0400 (0:00:00.285) 0:17:56.819 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:02:53 -0400 (0:00:00.300) 0:17:57.119 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:02:53 -0400 (0:00:00.352) 0:17:57.472 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:02:53 -0400 (0:00:00.399) 0:17:57.871 ********** ok: [managed-node3] => { "changed": false } MSG: All 
assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:02:54 -0400 (0:00:00.362) 0:17:58.234 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:02:54 -0400 (0:00:00.482) 0:17:58.716 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:02:55 -0400 (0:00:00.323) 0:17:59.039 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:02:55 -0400 (0:00:00.401) 0:17:59.441 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:02:55 -0400 (0:00:00.291) 0:17:59.732 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:02:56 -0400 (0:00:00.364) 0:18:00.096 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:02:56 -0400 (0:00:00.247) 0:18:00.344 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:02:56 -0400 (0:00:00.199) 0:18:00.544 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:02:56 -0400 (0:00:00.261) 0:18:00.806 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:02:57 -0400 (0:00:00.173) 0:18:00.979 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:02:57 -0400 (0:00:00.192) 0:18:01.172 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:02:57 -0400 (0:00:00.214) 0:18:01.386 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:02:57 -0400 (0:00:00.211) 0:18:01.598 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:02:57 -0400 (0:00:00.285) 0:18:01.884 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:02:58 -0400 (0:00:00.262) 0:18:02.147 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:03:00 -0400 (0:00:02.589) 0:18:04.736 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:03:02 -0400 (0:00:01.524) 0:18:06.261 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:03:02 -0400 (0:00:00.369) 0:18:06.631 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:03:03 -0400 (0:00:00.360) 0:18:06.991 
********** ok: [managed-node3] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:03:04 -0400 (0:00:01.662) 0:18:08.654 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:03:04 -0400 (0:00:00.279) 0:18:08.933 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:03:05 -0400 (0:00:00.246) 0:18:09.179 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:03:05 -0400 (0:00:00.290) 0:18:09.469 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:03:05 -0400 (0:00:00.318) 0:18:09.788 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:03:06 -0400 (0:00:00.196) 0:18:09.985 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:03:06 -0400 (0:00:00.246) 0:18:10.231 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:03:06 -0400 (0:00:00.215) 0:18:10.447 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:03:06 -0400 (0:00:00.331) 0:18:10.779 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:03:07 -0400 (0:00:00.348) 
0:18:11.127 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:03:07 -0400 (0:00:00.313) 0:18:11.440 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:03:07 -0400 (0:00:00.225) 0:18:11.666 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:03:07 -0400 (0:00:00.266) 0:18:11.933 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:03:08 -0400 (0:00:00.197) 0:18:12.130 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:03:08 -0400 (0:00:00.350) 0:18:12.481 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:03:08 -0400 (0:00:00.288) 0:18:12.770 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:03:09 -0400 (0:00:00.234) 0:18:13.005 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:03:09 -0400 (0:00:00.370) 0:18:13.375 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:03:09 -0400 (0:00:00.249) 0:18:13.625 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:03:09 -0400 
(0:00:00.294) 0:18:13.919 ********** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:03:10 -0400 (0:00:00.307) 0:18:14.226 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:03:10 -0400 (0:00:00.308) 0:18:14.535 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:03:10 -0400 (0:00:00.315) 0:18:14.851 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025848", "end": "2026-04-20 16:03:12.380762", "rc": 0, "start": "2026-04-20 16:03:12.354914" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:03:12 -0400 (0:00:01.813) 0:18:16.664 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:03:12 -0400 (0:00:00.192) 0:18:16.857 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:03:13 -0400 (0:00:00.337) 0:18:17.195 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:03:13 -0400 (0:00:00.324) 0:18:17.519 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:03:13 -0400 (0:00:00.266) 0:18:17.785 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task 
path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:03:14 -0400 (0:00:00.267) 0:18:18.052 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:03:14 -0400 (0:00:00.323) 0:18:18.376 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:03:14 -0400 (0:00:00.302) 0:18:18.679 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:03:14 -0400 (0:00:00.182) 0:18:18.861 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391 Monday 20 April 2026 16:03:15 -0400 (0:00:00.250) 0:18:19.111 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:03:15 -0400 (0:00:00.498) 0:18:19.610 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:03:15 -0400 (0:00:00.236) 0:18:19.847 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:03:16 -0400 (0:00:00.248) 0:18:20.095 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:03:16 -0400 (0:00:00.211) 0:18:20.306 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:03:18 -0400 (0:00:01.801) 0:18:22.108 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:03:19 -0400 (0:00:01.539) 0:18:23.647 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:03:20 -0400 (0:00:00.422) 0:18:24.070 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:03:20 -0400 (0:00:00.260) 0:18:24.330 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:03:20 -0400 (0:00:00.261) 0:18:24.591 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:03:20 -0400 (0:00:00.239) 0:18:24.831 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:03:21 -0400 (0:00:00.192) 0:18:25.024 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK 
[fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:03:21 -0400 (0:00:00.667) 0:18:25.691 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:03:22 -0400 (0:00:00.327) 0:18:26.019 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:03:22 -0400 (0:00:00.245) 0:18:26.265 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:03:26 -0400 (0:00:04.590) 0:18:30.855 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:03:27 -0400 (0:00:00.176) 0:18:31.032 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:03:27 -0400 (0:00:00.179) 0:18:31.211 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:03:32 -0400 (0:00:05.693) 0:18:36.905 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:03:33 -0400 (0:00:00.352) 0:18:37.257 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:03:33 -0400 (0:00:00.165) 0:18:37.423 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:03:33 -0400 (0:00:00.270) 0:18:37.693 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:03:33 -0400 (0:00:00.140) 0:18:37.834 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:03:38 -0400 (0:00:04.330) 0:18:42.164 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service": { "name": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service": { "name": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:03:41 -0400 (0:00:03.010) 0:18:45.175 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d46614478\x2d6da8\x2d40f1\x2da29a\x2d98a8b327ab04.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "name": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket system-systemd\\x2dcryptsetup.slice tmp.mount dev-sda1.device -.mount cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": 
"cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-46614478-6da8-40f1-a29a-98a8b327ab04", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-46614478-6da8-40f1-a29a-98a8b327ab04 /dev/sda1 /tmp/storage_testnoe9lpp6lukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-46614478-6da8-40f1-a29a-98a8b327ab04 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount system-systemd\\x2dcryptsetup.slice", "RequiresMountsFor": "/tmp/storage_testnoe9lpp6lukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:01:46 EDT", "StateChangeTimestampMonotonic": "7668631175", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d6da8\x2d40f1\x2da29a\x2d98a8b327ab04.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "name": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": 
"0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:03:46 -0400 (0:00:04.807) 0:18:49.982 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": 
false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:03:51 -0400 (0:00:05.931) 0:18:55.914 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:03:52 -0400 (0:00:00.200) 0:18:56.114 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715304.3355477, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e3e206cc04a4ba346b7ae87068f92cd1f09c9b31", "ctime": 1776715304.3325477, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715304.3325477, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:03:53 -0400 (0:00:01.444) 0:18:57.559 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:03:53 -0400 
(0:00:00.272) 0:18:57.831 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d46614478\x2d6da8\x2d40f1\x2da29a\x2d98a8b327ab04.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "name": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": 
"67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d46614478\\x2d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d6da8\x2d40f1\x2da29a\x2d98a8b327ab04.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "name": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "status": { 
"ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": 
"infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d6da8\\x2d40f1\\x2da29a\\x2d98a8b327ab04.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:03:58 -0400 (0:00:04.191) 0:19:02.023 ********** ok: [managed-node3] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" 
], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:03:58 -0400 (0:00:00.239) 0:19:02.262 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:03:58 -0400 (0:00:00.320) 0:19:02.583 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:03:58 -0400 (0:00:00.208) 0:19:02.792 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:03:59 -0400 (0:00:00.304) 0:19:03.096 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:04:01 -0400 (0:00:02.172) 0:19:05.268 ********** ok: [managed-node3] => (item={'src': '/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:04:02 -0400 (0:00:01.656) 0:19:06.925 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:04:03 -0400 (0:00:00.252) 0:19:07.178 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:04:05 -0400 (0:00:01.934) 0:19:09.112 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715321.1985185, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4f54d2a83e2064314b49bfe9eaf48a1de3fb8ba5", "ctime": 1776715311.5635352, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 390070420, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715311.5635352, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "44895504", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:04:06 -0400 (0:00:01.459) 0:19:10.572 ********** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:04:06 -0400 (0:00:00.130) 0:19:10.702 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:04:08 -0400 (0:00:01.938) 0:19:12.640 ********** ok: [managed-node3] => { "changed": false } TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Monday 20 April 2026 16:04:10 -0400 (0:00:01.600) 0:19:14.240 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:411 Monday 20 April 2026 16:04:10 -0400 (0:00:00.266) 0:19:14.507 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:04:10 -0400 (0:00:00.318) 0:19:14.826 ********** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:04:11 -0400 (0:00:00.274) 0:19:15.100 ********** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:04:11 -0400 (0:00:00.177) 0:19:15.277 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "54f1b36d-4103-4a30-8816-49506ecb6f36" }, "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "size": "4G", "type": "crypt", "uuid": "23d59f11-cd37-4e03-be1c-53ab51fca6cf" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, 
"/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:04:12 -0400 (0:00:01.447) 0:19:16.725 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002406", "end": "2026-04-20 16:04:13.627637", "rc": 0, "start": "2026-04-20 16:04:13.625231" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:04:13 -0400 (0:00:01.007) 0:19:17.732 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002582", "end": "2026-04-20 16:04:14.684184", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:04:14.681602" } STDOUT: luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:04:15 -0400 (0:00:01.215) 0:19:18.948 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 16:04:15 -0400 (0:00:00.347) 0:19:19.296 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 16:04:15 -0400 (0:00:00.185) 0:19:19.481 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.022891", "end": "2026-04-20 16:04:16.722430", "rc": 0, "start": "2026-04-20 16:04:16.699539" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 16:04:17 -0400 (0:00:01.504) 0:19:20.985 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 16:04:17 -0400 (0:00:00.307) 0:19:21.293 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 16:04:17 -0400 (0:00:00.594) 0:19:21.887 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 16:04:18 -0400 (0:00:00.443) 0:19:22.331 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 16:04:20 -0400 (0:00:01.673) 0:19:24.005 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 16:04:20 -0400 (0:00:00.155) 0:19:24.160 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 16:04:20 -0400 (0:00:00.284) 0:19:24.445 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 16:04:20 -0400 (0:00:00.309) 0:19:24.754 ********** ok: [managed-node3] => { "ansible_facts": { 
"_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 16:04:21 -0400 (0:00:00.226) 0:19:24.980 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 16:04:21 -0400 (0:00:00.218) 0:19:25.198 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 16:04:21 -0400 (0:00:00.217) 0:19:25.416 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 16:04:21 -0400 (0:00:00.408) 0:19:25.824 ********** ok: [managed-node3] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.113 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 16:04:23 -0400 (0:00:01.641) 0:19:27.465 ********** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 16:04:23 -0400 (0:00:00.261) 0:19:27.727 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 16:04:24 -0400 (0:00:00.273) 0:19:28.001 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 16:04:24 -0400 (0:00:00.131) 0:19:28.132 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 16:04:24 -0400 (0:00:00.163) 0:19:28.296 ********** skipping: [managed-node3] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 16:04:24 -0400 (0:00:00.145) 0:19:28.441 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 16:04:24 -0400 (0:00:00.210) 0:19:28.652 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 16:04:24 -0400 (0:00:00.230) 0:19:28.882 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 16:04:25 -0400 (0:00:00.204) 0:19:29.087 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 16:04:25 -0400 (0:00:00.298) 0:19:29.386 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 16:04:25 -0400 (0:00:00.278) 0:19:29.665 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 16:04:26 -0400 (0:00:00.381) 0:19:30.046 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 16:04:26 -0400 (0:00:00.253) 0:19:30.300 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 16:04:26 -0400 (0:00:00.292) 0:19:30.592 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK 
[Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 16:04:27 -0400 (0:00:00.473) 0:19:31.066 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 16:04:27 -0400 (0:00:00.322) 0:19:31.388 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 16:04:27 -0400 (0:00:00.263) 0:19:31.652 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 16:04:27 -0400 (0:00:00.184) 0:19:31.836 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 16:04:28 -0400 (0:00:00.118) 0:19:31.955 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 16:04:28 -0400 (0:00:00.148) 0:19:32.104 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 16:04:28 -0400 (0:00:00.220) 0:19:32.324 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 16:04:28 -0400 (0:00:00.256) 0:19:32.581 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 16:04:28 -0400 (0:00:00.196) 0:19:32.778 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task 
path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 16:04:29 -0400 (0:00:00.451) 0:19:33.229 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 16:04:29 -0400 (0:00:00.356) 0:19:33.585 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 16:04:29 -0400 (0:00:00.193) 0:19:33.779 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 16:04:30 -0400 (0:00:00.176) 0:19:33.955 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 16:04:30 -0400 (0:00:00.161) 0:19:34.116 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 16:04:30 -0400 (0:00:00.222) 0:19:34.339 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 16:04:30 -0400 (0:00:00.374) 0:19:34.713 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 16:04:30 -0400 (0:00:00.218) 0:19:34.932 ********** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 16:04:31 -0400 (0:00:00.209) 0:19:35.142 ********** included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 16:04:31 -0400 (0:00:00.287) 0:19:35.429 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 16:04:31 -0400 (0:00:00.243) 0:19:35.673 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 16:04:31 -0400 (0:00:00.240) 0:19:35.914 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 16:04:32 -0400 (0:00:00.247) 0:19:36.161 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 16:04:32 -0400 (0:00:00.297) 0:19:36.459 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 16:04:33 -0400 (0:00:00.837) 0:19:37.297 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 16:04:33 -0400 (0:00:00.158) 0:19:37.455 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 16:04:33 -0400 (0:00:00.195) 0:19:37.651 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 16:04:34 -0400 (0:00:00.352) 0:19:38.003 ********** included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 16:04:34 -0400 (0:00:00.274) 0:19:38.278 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 16:04:34 -0400 (0:00:00.214) 0:19:38.493 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 16:04:34 -0400 (0:00:00.250) 0:19:38.743 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 16:04:34 -0400 (0:00:00.174) 0:19:38.917 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 16:04:35 -0400 (0:00:00.252) 0:19:39.170 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 16:04:35 -0400 (0:00:00.256) 0:19:39.427 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 16:04:35 -0400 (0:00:00.230) 0:19:39.657 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 16:04:35 -0400 (0:00:00.280) 0:19:39.938 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 16:04:36 -0400 (0:00:00.590) 0:19:40.528 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] 
***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 16:04:36 -0400 (0:00:00.339) 0:19:40.868 ********** skipping: [managed-node3] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 16:04:37 -0400 (0:00:00.321) 0:19:41.189 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 16:04:37 -0400 (0:00:00.307) 0:19:41.497 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 16:04:37 -0400 (0:00:00.190) 0:19:41.687 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 16:04:37 -0400 (0:00:00.238) 0:19:41.926 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 16:04:38 -0400 (0:00:00.168) 0:19:42.094 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 16:04:38 -0400 (0:00:00.186) 0:19:42.280 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 16:04:38 -0400 (0:00:00.182) 0:19:42.463 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:04:38 -0400 (0:00:00.330) 0:19:42.794 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for 
storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:04:39 -0400 (0:00:00.261) 0:19:43.055 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:04:40 -0400 (0:00:01.225) 0:19:44.281 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:04:40 -0400 (0:00:00.233) 0:19:44.515 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:04:40 -0400 (0:00:00.222) 0:19:44.737 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:04:41 -0400 (0:00:00.292) 0:19:45.030 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:04:41 -0400 (0:00:00.284) 0:19:45.314 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:04:41 
-0400 (0:00:00.272) 0:19:45.587 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:04:41 -0400 (0:00:00.234) 0:19:45.821 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:04:42 -0400 (0:00:00.256) 0:19:46.078 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:04:42 -0400 (0:00:00.256) 0:19:46.334 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:04:42 -0400 (0:00:00.191) 0:19:46.525 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:04:42 -0400 (0:00:00.265) 0:19:46.791 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:04:43 -0400 (0:00:00.311) 0:19:47.102 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:04:43 -0400 (0:00:00.552) 0:19:47.655 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:04:43 -0400 (0:00:00.274) 0:19:47.930 ********** ok: 
[managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:04:44 -0400 (0:00:00.223) 0:19:48.154 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:04:44 -0400 (0:00:00.227) 0:19:48.382 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:04:44 -0400 (0:00:00.262) 0:19:48.645 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:04:44 -0400 (0:00:00.201) 0:19:48.846 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:04:45 -0400 (0:00:00.227) 0:19:49.074 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:04:45 -0400 (0:00:00.282) 0:19:49.356 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715370.9024324, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715295.1395638, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 284168, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715295.1395638, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:04:46 -0400 (0:00:01.372) 0:19:50.728 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions 
passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:04:47 -0400 (0:00:00.405) 0:19:51.133 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:04:47 -0400 (0:00:00.277) 0:19:51.411 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:04:47 -0400 (0:00:00.207) 0:19:51.619 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:04:47 -0400 (0:00:00.266) 0:19:51.885 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:04:48 -0400 (0:00:00.125) 0:19:52.010 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:04:48 -0400 (0:00:00.384) 0:19:52.395 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715431.5513272, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715295.3205633, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 284345, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715295.3205633, "nlink": 1, "path": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:04:49 -0400 (0:00:01.550) 0:19:53.945 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 
16:04:53 -0400 (0:00:03.775) 0:19:57.721 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010369", "end": "2026-04-20 16:04:54.998807", "rc": 0, "start": "2026-04-20 16:04:54.988438" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 8c d5 38 46 b6 05 3a b4 fe 19 9c 40 bf a2 51 19 83 0b f4 38 MK salt: 19 7f 8e 61 2a 3f 50 35 f5 b9 af 33 15 50 23 1b 11 d2 91 b0 cb 66 20 7e d2 96 12 4c d7 0e b5 73 MK iterations: 120249 UUID: 54f1b36d-4103-4a30-8816-49506ecb6f36 Key Slot 0: ENABLED Iterations: 1909974 Salt: ab f6 07 ce 40 3f 2a 15 0d c4 87 e3 cb 43 c3 f1 08 f2 b2 cd f1 18 4a 62 f2 89 c6 da fb 5d 1e ce Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:04:55 -0400 (0:00:01.566) 0:19:59.287 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:04:55 -0400 (0:00:00.362) 0:19:59.650 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:04:56 -0400 (0:00:00.371) 0:20:00.022 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:04:56 -0400 (0:00:00.437) 0:20:00.459 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:04:56 -0400 (0:00:00.361) 0:20:00.821 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:04:57 -0400 (0:00:00.521) 0:20:01.342 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:04:57 -0400 (0:00:00.289) 0:20:01.632 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test 
variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:04:57 -0400 (0:00:00.262) 0:20:01.894 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:04:58 -0400 (0:00:00.247) 0:20:02.142 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:04:58 -0400 (0:00:00.236) 0:20:02.379 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:04:58 -0400 (0:00:00.265) 0:20:02.645 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:04:59 -0400 (0:00:00.333) 0:20:02.978 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:04:59 -0400 (0:00:00.265) 0:20:03.243 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:04:59 -0400 (0:00:00.222) 0:20:03.465 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:04:59 -0400 (0:00:00.356) 0:20:03.822 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:05:00 -0400 (0:00:00.322) 0:20:04.145 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:05:00 -0400 (0:00:00.294) 0:20:04.439 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:05:00 -0400 (0:00:00.254) 0:20:04.693 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:05:00 -0400 (0:00:00.188) 0:20:04.881 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:05:01 -0400 (0:00:00.133) 0:20:05.015 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:05:01 -0400 (0:00:00.190) 0:20:05.205 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:05:01 -0400 (0:00:00.163) 0:20:05.369 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:05:01 -0400 (0:00:00.201) 0:20:05.570 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:05:01 -0400 (0:00:00.222) 0:20:05.793 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:05:03 -0400 (0:00:01.517) 0:20:07.310 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:05:04 -0400 (0:00:01.565) 0:20:08.876 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:05:05 -0400 (0:00:00.361) 0:20:09.237 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:05:05 -0400 (0:00:00.289) 0:20:09.527 ********** ok: [managed-node3] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:05:07 -0400 (0:00:01.478) 0:20:11.006 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:05:07 -0400 (0:00:00.313) 0:20:11.319 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:05:07 -0400 (0:00:00.235) 0:20:11.554 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:05:07 -0400 (0:00:00.220) 0:20:11.774 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:05:08 -0400 (0:00:00.245) 0:20:12.020 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:05:08 -0400 (0:00:00.159) 0:20:12.180 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:05:08 -0400 (0:00:00.218) 0:20:12.398 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:05:08 -0400 (0:00:00.143) 0:20:12.542 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:05:08 -0400 (0:00:00.112) 0:20:12.654 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:05:08 -0400 (0:00:00.061) 0:20:12.715 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:05:09 -0400 (0:00:00.257) 0:20:12.973 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:05:09 -0400 (0:00:00.079) 0:20:13.053 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:05:09 -0400 (0:00:00.266) 0:20:13.320 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:05:09 -0400 (0:00:00.225) 0:20:13.545 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:05:09 -0400 (0:00:00.200) 0:20:13.746 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:05:09 -0400 (0:00:00.201) 0:20:13.947 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:05:10 -0400 (0:00:00.321) 0:20:14.268 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:05:10 -0400 (0:00:00.265) 0:20:14.534 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:05:10 -0400 (0:00:00.292) 0:20:14.826 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:05:11 -0400 (0:00:00.296) 0:20:15.123 ********** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:05:11 -0400 (0:00:00.251) 0:20:15.374 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:05:11 -0400 (0:00:00.278) 0:20:15.653 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:05:12 -0400 (0:00:00.309) 0:20:15.963 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023572", "end": "2026-04-20 16:05:13.391634", "rc": 0, "start": "2026-04-20 16:05:13.368062" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:05:13 -0400 (0:00:01.688) 0:20:17.651 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:05:14 -0400 (0:00:00.313) 0:20:17.965 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:05:14 -0400 (0:00:00.366) 0:20:18.332 ********** skipping: [managed-node3] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:05:14 -0400 (0:00:00.284) 0:20:18.616 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:05:14 -0400 (0:00:00.243) 0:20:18.859 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:05:15 -0400 (0:00:00.217) 0:20:19.077 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:05:15 -0400 (0:00:00.309) 0:20:19.387 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:05:15 -0400 (0:00:00.216) 0:20:19.604 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:05:15 -0400 (0:00:00.302) 0:20:19.906 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 16:05:16 -0400 (0:00:00.275) 0:20:20.181 ********** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:417 Monday 20 April 2026 16:05:17 -0400 (0:00:01.668) 0:20:21.850 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 16:05:18 -0400 (0:00:00.415) 0:20:22.265 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, 
"storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 16:05:18 -0400 (0:00:00.473) 0:20:22.738 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:05:19 -0400 (0:00:00.327) 0:20:23.066 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:05:19 -0400 (0:00:00.251) 0:20:23.317 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:05:19 -0400 (0:00:00.323) 0:20:23.640 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:05:20 -0400 (0:00:01.165) 0:20:24.805 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:05:23 -0400 (0:00:02.489) 0:20:27.295 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:05:24 -0400 (0:00:01.628) 0:20:28.923 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", 
"kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:05:25 -0400 (0:00:00.545) 0:20:29.468 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:05:25 -0400 (0:00:00.253) 0:20:29.722 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:05:25 -0400 (0:00:00.205) 0:20:29.927 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:05:26 -0400 (0:00:00.182) 0:20:30.110 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:05:26 -0400 (0:00:00.168) 0:20:30.279 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:05:26 -0400 (0:00:00.374) 0:20:30.654 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:05:26 -0400 (0:00:00.137) 0:20:30.791 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:05:27 -0400 (0:00:00.206) 0:20:30.998 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 
16:05:31 -0400 (0:00:04.075) 0:20:35.074 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:05:31 -0400 (0:00:00.304) 0:20:35.378 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:05:31 -0400 (0:00:00.340) 0:20:35.718 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:05:37 -0400 (0:00:05.556) 0:20:41.275 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:05:37 -0400 (0:00:00.233) 0:20:41.509 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:05:37 -0400 (0:00:00.131) 0:20:41.641 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:05:37 -0400 (0:00:00.141) 0:20:41.782 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:05:37 -0400 (0:00:00.090) 0:20:41.873 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:05:41 -0400 (0:00:04.068) 0:20:45.942 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": 
"auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { 
"name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { 
"name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service": { "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service": { "name": 
"systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:05:45 -0400 (0:00:03.067) 0:20:49.009 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d54f1b36d\x2d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-54f1b36d-4103-4a30-8816-49506ecb6f36 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": 
"dev-mapper-luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:45 EDT", "StateChangeTimestampMonotonic": "7787850292", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", 
"ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": 
"success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:05:48 -0400 (0:00:03.758) 0:20:52.768 ********** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-54f1b36d-4103-4a30-8816-49506ecb6f36' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 16:05:54 -0400 (0:00:05.534) 0:20:58.303 ********** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-54f1b36d-4103-4a30-8816-49506ecb6f36' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:05:54 -0400 (0:00:00.228) 0:20:58.531 ********** changed: [managed-node3] => 
(item=systemd-cryptsetup@luks\x2d54f1b36d\x2d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:45 EDT", "StateChangeTimestampMonotonic": "7787850292", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", 
"LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 16:05:58 -0400 (0:00:03.523) 0:21:02.055 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 16:05:58 -0400 (0:00:00.280) 0:21:02.335 ********** ok: [managed-node3] => { "changed": false } 
MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Monday 20 April 2026 16:05:58 -0400 (0:00:00.462) 0:21:02.798 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 16:05:59 -0400 (0:00:00.297) 0:21:03.095 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715517.5781782, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715517.5781782, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776715517.5781782, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "970895500", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 16:06:00 -0400 (0:00:01.320) 0:21:04.416 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:440 Monday 20 April 2026 16:06:00 -0400 (0:00:00.208) 0:21:04.625 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:06:01 -0400 (0:00:00.444) 0:21:05.069 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:06:01 -0400 (0:00:00.229) 0:21:05.299 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:06:01 -0400 (0:00:00.341) 0:21:05.640 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
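The "Stat the file" and "Assert file presence" tasks above are the test's data-preservation check: /opt/test1/quux was created on the volume before the (failed) role run, and the test asserts it still exists afterwards. A minimal sketch of that stat-and-assert pattern; the task names and registered variable are illustrative, not copied from verify-data-preservation.yml:

    - name: Stat the canary file created before the role ran
      ansible.builtin.stat:
        path: /opt/test1/quux
      register: stat_r

    - name: Assert file presence (data on the volume survived the operation)
      ansible.builtin.assert:
        that:
          - stat_r.stat.exists
        msg: "data lost: /opt/test1/quux is missing"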
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:06:01 -0400 (0:00:00.250) 0:21:05.891 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:06:04 -0400 (0:00:02.401) 0:21:08.292 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:06:05 -0400 (0:00:01.370) 0:21:09.663 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:06:06 -0400 (0:00:00.557) 0:21:10.221 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:06:06 -0400 (0:00:00.267) 0:21:10.488 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:06:06 -0400 (0:00:00.214) 0:21:10.703 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:06:06 -0400 (0:00:00.080) 0:21:10.783 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:06:06 -0400 (0:00:00.098) 0:21:10.882 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:06:07 -0400 (0:00:00.571) 0:21:11.454 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:06:07 -0400 (0:00:00.152) 0:21:11.606 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:06:07 -0400 (0:00:00.234) 0:21:11.841 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:06:12 -0400 (0:00:04.409) 0:21:16.251 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:06:12 -0400 (0:00:00.270) 0:21:16.521 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:06:12 -0400 (0:00:00.365) 0:21:16.886 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:06:18 -0400 (0:00:05.847) 0:21:22.734 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
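The "Show storage_pools" output above is the desired state for this pass of the role: the existing LUKS-backed LVM volume test1 on sda should end up as an unencrypted volume mounted at /opt/test1. Expressed as a playbook variable, the same pool definition would look roughly like this (structure inferred from the values echoed in the log, with defaults such as fs_type left implicit):

    storage_pools:
      - name: foo
        type: lvm
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: false
            encryption_password: yabbadabbadoo  # still supplied so the role can open the existing LUKS device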
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:06:19 -0400 (0:00:00.265) 0:21:23.000 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:06:19 -0400 (0:00:00.158) 0:21:23.159 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:06:19 -0400 (0:00:00.135) 0:21:23.294 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:06:19 -0400 (0:00:00.162) 0:21:23.457 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:06:23 -0400 (0:00:04.242) 0:21:27.699 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": 
"plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, 
"sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service": { "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service": { "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": 
"systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:06:26 -0400 (0:00:02.582) 0:21:30.282 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d54f1b36d\x2d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", 
"AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-54f1b36d-4103-4a30-8816-49506ecb6f36 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:45 EDT", "StateChangeTimestampMonotonic": "7787850292", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", 
"UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", 
"LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:06:30 -0400 (0:00:04.162) 0:21:34.445 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": 
"/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:06:36 -0400 (0:00:06.359) 0:21:40.804 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:06:37 -0400 (0:00:00.185) 0:21:40.990 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715304.3355477, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", 
"checksum": "e3e206cc04a4ba346b7ae87068f92cd1f09c9b31", "ctime": 1776715304.3325477, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715304.3325477, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:06:38 -0400 (0:00:01.423) 0:21:42.413 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:06:40 -0400 (0:00:01.833) 0:21:44.247 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d54f1b36d\x2d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": 
"no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:45 EDT", "StateChangeTimestampMonotonic": "7787850292", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", 
"IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", 
"StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:06:45 -0400 (0:00:04.794) 0:21:49.042 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:06:45 -0400 (0:00:00.200) 0:21:49.243 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:06:45 -0400 (0:00:00.287) 0:21:49.530 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:06:45 -0400 (0:00:00.329) 0:21:49.860 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-54f1b36d-4103-4a30-8816-49506ecb6f36" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** 
task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:06:47 -0400 (0:00:01.540) 0:21:51.400 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:06:49 -0400 (0:00:01.916) 0:21:53.317 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:06:51 -0400 (0:00:01.902) 0:21:55.219 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:06:51 -0400 (0:00:00.362) 0:21:55.582 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:06:53 -0400 (0:00:01.929) 0:21:57.511 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715321.1985185, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4f54d2a83e2064314b49bfe9eaf48a1de3fb8ba5", "ctime": 1776715311.5635352, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 390070420, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715311.5635352, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "44895504", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task 
path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:06:54 -0400 (0:00:01.368) 0:21:58.879 ********** changed: [managed-node3] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-54f1b36d-4103-4a30-8816-49506ecb6f36', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:06:56 -0400 (0:00:01.669) 0:22:00.549 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:06:58 -0400 (0:00:01.868) 0:22:02.418 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:455 Monday 20 April 2026 16:06:59 -0400 (0:00:01.396) 0:22:03.815 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:07:00 -0400 (0:00:00.471) 0:22:04.286 ********** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", 
"vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:07:00 -0400 (0:00:00.340) 0:22:04.626 ********** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:07:00 -0400 (0:00:00.257) 0:22:04.884 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "1d940550-c8f2-422f-a05b-9362b15917d5" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:07:02 -0400 (0:00:01.764) 0:22:06.648 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002647", "end": "2026-04-20 16:07:04.261431", "rc": 0, "start": "2026-04-20 16:07:04.258784" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:07:04 -0400 (0:00:01.827) 0:22:08.476 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002661", "end": "2026-04-20 16:07:06.043203", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:07:06.040542" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:07:06 -0400 (0:00:01.865) 0:22:10.341 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 16:07:06 -0400 (0:00:00.365) 0:22:10.706 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 16:07:06 -0400 (0:00:00.138) 0:22:10.845 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024420", "end": "2026-04-20 16:07:08.392707", "rc": 0, "start": "2026-04-20 16:07:08.368287" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 16:07:08 -0400 (0:00:01.746) 0:22:12.591 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 16:07:09 -0400 (0:00:00.382) 0:22:12.974 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 16:07:09 -0400 (0:00:00.453) 0:22:13.427 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 16:07:09 -0400 (0:00:00.348) 0:22:13.776 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 16:07:11 -0400 (0:00:01.414) 0:22:15.191 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 16:07:11 -0400 (0:00:00.282) 0:22:15.474 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 16:07:11 -0400 (0:00:00.360) 0:22:15.835 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 16:07:12 -0400 (0:00:00.300) 0:22:16.135 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 16:07:12 -0400 (0:00:00.272) 0:22:16.408 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 16:07:12 -0400 (0:00:00.212) 0:22:16.620 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Monday 20 April 2026 16:07:12 -0400 (0:00:00.267) 0:22:16.888 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", 
"changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 16:07:13 -0400 (0:00:00.324) 0:22:17.212 ********** ok: [managed-node3] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.113 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 16:07:14 -0400 (0:00:01.348) 0:22:18.560 ********** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 16:07:14 -0400 (0:00:00.268) 0:22:18.828 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 16:07:15 -0400 (0:00:00.487) 0:22:19.316 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 16:07:15 -0400 (0:00:00.281) 0:22:19.597 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 16:07:15 -0400 (0:00:00.350) 0:22:19.947 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 16:07:16 -0400 (0:00:00.376) 0:22:20.323 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 16:07:16 -0400 (0:00:00.281) 0:22:20.605 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 16:07:16 -0400 (0:00:00.259) 0:22:20.865 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 16:07:17 -0400 (0:00:00.286) 0:22:21.151 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 16:07:17 -0400 (0:00:00.190) 0:22:21.342 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 16:07:17 -0400 (0:00:00.227) 0:22:21.571 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 16:07:17 -0400 (0:00:00.329) 0:22:21.901 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 16:07:18 -0400 (0:00:00.298) 0:22:22.199 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 16:07:18 -0400 (0:00:00.312) 0:22:22.512 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 16:07:19 -0400 (0:00:00.535) 0:22:23.047 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 16:07:19 -0400 (0:00:00.382) 0:22:23.429 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 16:07:19 -0400 (0:00:00.259) 0:22:23.689 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 16:07:19 -0400 (0:00:00.170) 0:22:23.859 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 16:07:20 -0400 (0:00:00.290) 0:22:24.150 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 16:07:20 -0400 (0:00:00.252) 0:22:24.403 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 16:07:20 -0400 (0:00:00.278) 0:22:24.681 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 16:07:20 -0400 (0:00:00.218) 0:22:24.899 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 16:07:21 -0400 (0:00:00.223) 0:22:25.123 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 16:07:21 -0400 (0:00:00.445) 0:22:25.569 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 16:07:21 -0400 (0:00:00.279) 0:22:25.848 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 16:07:22 -0400 (0:00:00.239) 0:22:26.088 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 16:07:22 -0400 (0:00:00.437) 0:22:26.525 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 16:07:22 -0400 (0:00:00.199) 0:22:26.725 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 16:07:23 -0400 (0:00:00.316) 0:22:27.042 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 16:07:23 -0400 (0:00:00.640) 0:22:27.682 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 16:07:23 -0400 (0:00:00.257) 0:22:27.940 ********** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 16:07:24 -0400 (0:00:00.359) 0:22:28.300 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 16:07:24 -0400 (0:00:00.524) 0:22:28.824 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 16:07:25 -0400 (0:00:00.357) 0:22:29.182 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 16:07:25 -0400 (0:00:00.363) 0:22:29.545 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 16:07:25 -0400 (0:00:00.337) 0:22:29.883 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 16:07:26 -0400 (0:00:00.338) 0:22:30.221 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 16:07:26 -0400 (0:00:00.337) 0:22:30.559 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 16:07:26 -0400 (0:00:00.269) 0:22:30.828 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 16:07:27 -0400 (0:00:00.311) 0:22:31.139 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 16:07:27 -0400 (0:00:00.764) 0:22:31.904 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 16:07:29 -0400 (0:00:01.287) 0:22:33.192 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 16:07:29 -0400 (0:00:00.198) 0:22:33.391 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 16:07:29 -0400 (0:00:00.241) 0:22:33.632 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about 
VDO compression] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 16:07:29 -0400 (0:00:00.276) 0:22:33.909 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 16:07:30 -0400 (0:00:00.345) 0:22:34.255 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 16:07:30 -0400 (0:00:00.277) 0:22:34.532 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 16:07:30 -0400 (0:00:00.270) 0:22:34.802 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 16:07:31 -0400 (0:00:00.322) 0:22:35.125 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 16:07:31 -0400 (0:00:00.500) 0:22:35.625 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 16:07:31 -0400 (0:00:00.304) 0:22:35.930 ********** skipping: [managed-node3] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 16:07:32 -0400 (0:00:00.266) 0:22:36.197 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 16:07:32 -0400 (0:00:00.203) 0:22:36.401 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 16:07:32 -0400 (0:00:00.398) 0:22:36.799 
********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 16:07:33 -0400 (0:00:00.259) 0:22:37.059 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 16:07:33 -0400 (0:00:00.256) 0:22:37.315 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 16:07:33 -0400 (0:00:00.224) 0:22:37.539 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 16:07:33 -0400 (0:00:00.210) 0:22:37.750 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:07:34 -0400 (0:00:00.407) 0:22:38.158 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:07:34 -0400 (0:00:00.203) 0:22:38.362 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:07:35 -0400 (0:00:01.392) 0:22:39.754 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:07:36 -0400 (0:00:00.298) 0:22:40.053 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:07:36 -0400 (0:00:00.511) 0:22:40.564 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:07:37 -0400 (0:00:00.462) 0:22:41.027 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:07:37 -0400 (0:00:00.304) 0:22:41.331 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:07:37 -0400 (0:00:00.383) 0:22:41.715 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:07:38 -0400 (0:00:00.356) 0:22:42.071 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:07:38 -0400 (0:00:00.342) 0:22:42.414 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:07:38 -0400 (0:00:00.216) 0:22:42.631 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:07:39 -0400 (0:00:00.360) 0:22:42.992 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:07:39 -0400 (0:00:00.274) 0:22:43.267 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:07:39 -0400 (0:00:00.278) 0:22:43.545 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:07:40 -0400 (0:00:00.649) 0:22:44.194 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:07:40 -0400 (0:00:00.309) 0:22:44.504 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:07:40 -0400 (0:00:00.231) 0:22:44.736 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:07:40 -0400 (0:00:00.201) 0:22:44.938 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:07:41 -0400 (0:00:00.355) 0:22:45.294 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, 
"storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:07:41 -0400 (0:00:00.308) 0:22:45.602 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:07:42 -0400 (0:00:00.393) 0:22:45.995 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:07:42 -0400 (0:00:00.354) 0:22:46.350 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715596.4970415, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715596.4970415, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 318516, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715596.4970415, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:07:44 -0400 (0:00:01.672) 0:22:48.022 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:07:44 -0400 (0:00:00.329) 0:22:48.351 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:07:44 -0400 (0:00:00.281) 0:22:48.633 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:07:44 -0400 (0:00:00.308) 0:22:48.941 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:07:45 -0400 (0:00:00.326) 0:22:49.268 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:07:45 -0400 (0:00:00.338) 0:22:49.606 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:07:45 -0400 (0:00:00.279) 0:22:49.885 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:07:46 -0400 (0:00:00.184) 0:22:50.070 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:07:50 -0400 (0:00:04.051) 0:22:54.121 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:07:50 -0400 (0:00:00.248) 0:22:54.370 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:07:50 -0400 (0:00:00.235) 0:22:54.606 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:07:50 -0400 (0:00:00.265) 0:22:54.871 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:07:51 -0400 (0:00:00.215) 0:22:55.086 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:07:51 -0400 (0:00:00.239) 0:22:55.326 ********** skipping: 
[managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:07:51 -0400 (0:00:00.204) 0:22:55.531 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:07:51 -0400 (0:00:00.133) 0:22:55.664 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:07:51 -0400 (0:00:00.188) 0:22:55.852 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:07:52 -0400 (0:00:00.300) 0:22:56.152 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:07:52 -0400 (0:00:00.232) 0:22:56.385 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:07:52 -0400 (0:00:00.285) 0:22:56.671 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:07:52 -0400 (0:00:00.133) 0:22:56.804 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:07:53 -0400 (0:00:00.219) 0:22:57.024 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:07:53 -0400 
(0:00:00.227) 0:22:57.251 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:07:53 -0400 (0:00:00.274) 0:22:57.525 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:07:53 -0400 (0:00:00.279) 0:22:57.804 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:07:54 -0400 (0:00:00.332) 0:22:58.137 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:07:54 -0400 (0:00:00.279) 0:22:58.416 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:07:54 -0400 (0:00:00.372) 0:22:58.835 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:07:55 -0400 (0:00:00.279) 0:22:59.115 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:07:55 -0400 (0:00:00.246) 0:22:59.361 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:07:55 -0400 (0:00:00.240) 0:22:59.601 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:07:55 -0400 (0:00:00.241) 0:22:59.843 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] 
************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:07:56 -0400 (0:00:00.201) 0:23:00.044 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:07:57 -0400 (0:00:01.757) 0:23:01.802 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:07:59 -0400 (0:00:01.537) 0:23:03.339 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:07:59 -0400 (0:00:00.365) 0:23:03.704 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:07:59 -0400 (0:00:00.214) 0:23:03.919 ********** ok: [managed-node3] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:08:01 -0400 (0:00:01.336) 0:23:05.256 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:08:01 -0400 (0:00:00.247) 0:23:05.503 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:08:01 -0400 (0:00:00.209) 0:23:05.712 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:08:01 -0400 (0:00:00.170) 0:23:05.883 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:08:02 -0400 (0:00:00.177) 0:23:06.060 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default 
minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:08:02 -0400 (0:00:00.225) 0:23:06.286 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:08:02 -0400 (0:00:00.194) 0:23:06.481 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:08:02 -0400 (0:00:00.210) 0:23:06.691 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:08:03 -0400 (0:00:00.291) 0:23:06.982 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:08:03 -0400 (0:00:00.175) 0:23:07.157 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:08:03 -0400 (0:00:00.172) 0:23:07.330 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:08:03 -0400 (0:00:00.147) 0:23:07.477 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:08:03 -0400 (0:00:00.191) 0:23:07.668 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:08:03 -0400 (0:00:00.130) 0:23:07.799 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:08:03 -0400 (0:00:00.144) 0:23:07.943 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the 
expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:08:04 -0400 (0:00:00.175) 0:23:08.119 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:08:04 -0400 (0:00:00.186) 0:23:08.305 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:08:04 -0400 (0:00:00.240) 0:23:08.546 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:08:04 -0400 (0:00:00.209) 0:23:08.755 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:08:05 -0400 (0:00:00.219) 0:23:08.975 ********** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:08:05 -0400 (0:00:00.232) 0:23:09.207 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:08:05 -0400 (0:00:00.238) 0:23:09.445 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:08:05 -0400 (0:00:00.302) 0:23:09.748 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025647", "end": "2026-04-20 16:08:07.150798", "rc": 0, "start": "2026-04-20 16:08:07.125151" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 
2026 16:08:07 -0400 (0:00:01.584) 0:23:11.332 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:08:07 -0400 (0:00:00.218) 0:23:11.551 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:08:07 -0400 (0:00:00.253) 0:23:11.804 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:08:08 -0400 (0:00:00.199) 0:23:12.004 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:08:08 -0400 (0:00:00.268) 0:23:12.272 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:08:08 -0400 (0:00:00.348) 0:23:12.621 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:08:08 -0400 (0:00:00.233) 0:23:12.854 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:08:09 -0400 (0:00:00.243) 0:23:13.098 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:08:09 -0400 (0:00:00.174) 0:23:13.272 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Monday 20 April 2026 16:08:09 -0400 (0:00:00.264) 0:23:13.537 ********** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, 
"state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:461 Monday 20 April 2026 16:08:11 -0400 (0:00:01.514) 0:23:15.052 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Monday 20 April 2026 16:08:11 -0400 (0:00:00.485) 0:23:15.537 ********** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Monday 20 April 2026 16:08:11 -0400 (0:00:00.197) 0:23:15.734 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:08:12 -0400 (0:00:00.238) 0:23:15.973 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:08:12 -0400 (0:00:00.248) 0:23:16.222 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:08:12 -0400 (0:00:00.311) 0:23:16.533 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:08:12 -0400 (0:00:00.204) 0:23:16.738 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:08:14 -0400 (0:00:02.106) 0:23:18.844 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:08:16 -0400 (0:00:01.535) 0:23:20.379 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was 
False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:08:16 -0400 (0:00:00.536) 0:23:20.916 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:08:17 -0400 (0:00:00.936) 0:23:21.852 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:08:18 -0400 (0:00:00.110) 0:23:21.963 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:08:18 -0400 (0:00:00.075) 0:23:22.038 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:08:18 -0400 (0:00:00.108) 0:23:22.147 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:08:18 -0400 (0:00:00.402) 0:23:22.550 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:08:18 -0400 (0:00:00.198) 0:23:22.749 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:08:19 -0400 (0:00:00.201) 0:23:22.951 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:08:23 -0400 (0:00:04.524) 0:23:27.475 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:08:23 -0400 (0:00:00.265) 0:23:27.741 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:08:24 -0400 (0:00:00.264) 0:23:28.006 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:08:29 -0400 (0:00:05.626) 0:23:33.632 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:08:30 -0400 (0:00:00.424) 0:23:34.056 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:08:30 -0400 (0:00:00.200) 0:23:34.257 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:08:30 -0400 (0:00:00.204) 0:23:34.461 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:08:30 -0400 (0:00:00.123) 0:23:34.584 ********** ok: 
[managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:08:35 -0400 (0:00:04.559) 0:23:39.144 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service": { "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service": { "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": 
"systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:08:37 -0400 (0:00:02.590) 0:23:41.734 ********** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d54f1b36d\x2d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control 
cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-54f1b36d-4103-4a30-8816-49506ecb6f36", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-54f1b36d-4103-4a30-8816-49506ecb6f36 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-54f1b36d-4103-4a30-8816-49506ecb6f36 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", 
"MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Mon 2026-04-20 16:03:45 EDT", "StateChangeTimestampMonotonic": "7787850292", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", 
"CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": 
"systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:08:40 -0400 (0:00:02.798) 0:23:44.532 ********** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Monday 20 April 2026 16:08:45 -0400 (0:00:05.210) 0:23:49.743 ********** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:08:46 -0400 (0:00:00.226) 0:23:49.970 ********** changed: [managed-node3] => 
(item=systemd-cryptsetup@luks\x2d54f1b36d\x2d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d54f1b36d\\x2d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node3] => (item=systemd-cryptsetup@luk...d4103\x2d4a30\x2d8816\x2d49506ecb6f36.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "name": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d4103\\x2d4a30\\x2d8816\\x2d49506ecb6f36.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Monday 20 April 2026 16:08:48 -0400 (0:00:02.784) 0:23:52.755 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Monday 20 April 2026 16:08:48 -0400 (0:00:00.109) 0:23:52.864 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Monday 20 April 2026 16:08:49 -0400 (0:00:00.201) 0:23:53.067 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Monday 20 April 2026 16:08:49 -0400 (0:00:00.128) 0:23:53.195 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715690.8458781, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715690.8458781, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776715690.8458781, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2728577767", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Monday 20 April 2026 16:08:50 -0400 (0:00:00.993) 0:23:54.189 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:484 Monday 20 April 2026 16:08:50 -0400 (0:00:00.248) 0:23:54.437 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:08:50 -0400 (0:00:00.198) 0:23:54.636 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:08:50 -0400 (0:00:00.147) 0:23:54.783 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:08:50 -0400 (0:00:00.077) 0:23:54.861 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:08:50 -0400 (0:00:00.077) 0:23:54.938 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: 
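The Stat and Assert tasks above are the data-preservation check: /opt/test1/quux was created before the failing run, and the test only passes if the file still exists afterwards, confirming the role left the existing filesystem alone. The same pattern as a stand-alone pair of tasks (task names and the register variable are illustrative):

- name: Stat the canary file left on the volume (sketch)
  stat:
    path: /opt/test1/quux
  register: canary

- name: Assert that the file survived the failed storage run (sketch)
  assert:
    that:
      - canary.stat.exists
    fail_msg: data on /opt/test1 was not preserved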
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:08:52 -0400 (0:00:01.492) 0:23:56.431 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:08:53 -0400 (0:00:01.232) 0:23:57.663 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:08:54 -0400 (0:00:00.341) 0:23:58.005 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:08:54 -0400 (0:00:00.166) 0:23:58.172 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:08:54 -0400 (0:00:00.209) 0:23:58.381 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:08:54 -0400 (0:00:00.079) 0:23:58.461 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:08:54 -0400 (0:00:00.127) 0:23:58.588 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:08:54 -0400 (0:00:00.236) 0:23:58.825 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:08:54 -0400 (0:00:00.033) 0:23:58.858 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:08:54 -0400 (0:00:00.039) 0:23:58.897 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:08:58 -0400 (0:00:03.480) 0:24:02.378 ********** ok: [managed-node3] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:08:58 -0400 (0:00:00.234) 0:24:02.612 ********** ok: [managed-node3] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:08:58 -0400 (0:00:00.161) 0:24:02.773 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:09:04 -0400 (0:00:05.302) 0:24:08.076 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:09:04 -0400 (0:00:00.265) 0:24:08.341 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
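Just below, the role refreshes service facts before deciding, as it did ahead of the earlier Mask task, which generated cryptsetup units need masking. A compact approximation of that gather-and-filter step; the set_fact task is illustrative, not the role's actual code:

- name: Gather unit facts, as the role's Get service facts task does
  service_facts:

- name: Pick out any generated systemd-cryptsetup units (illustrative filter)
  set_fact:
    cryptsetup_units: "{{ ansible_facts.services | dict2items |
                          selectattr('key', 'match', '^systemd-cryptsetup@') |
                          map(attribute='key') | list }}"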
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 2026 16:09:04 -0400 (0:00:00.289) 0:24:08.631 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:09:04 -0400 (0:00:00.205) 0:24:08.836 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:09:04 -0400 (0:00:00.065) 0:24:08.902 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:09:09 -0400 (0:00:04.179) 0:24:13.082 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": 
"systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": 
"inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": 
"stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { 
"name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:09:11 -0400 (0:00:02.655) 0:24:15.737 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:09:12 -0400 (0:00:00.380) 0:24:16.118 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-2dc86b75-b498-4d2b-9924-478500dc0996", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:09:26 -0400 (0:00:14.811) 0:24:30.969 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:09:27 -0400 (0:00:00.184) 0:24:31.154 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715611.0010164, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1776715610.9970164, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715610.9970164, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:09:28 -0400 (0:00:01.040) 0:24:32.194 ********** ok: [managed-node3] => { "backup": "", 
"changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:09:29 -0400 (0:00:00.965) 0:24:33.159 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:09:29 -0400 (0:00:00.290) 0:24:33.449 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-2dc86b75-b498-4d2b-9924-478500dc0996", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK 
[fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:09:29 -0400 (0:00:00.248) 0:24:33.698 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:09:29 -0400 (0:00:00.199) 0:24:33.897 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:09:30 -0400 (0:00:00.237) 0:24:34.135 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:09:31 -0400 (0:00:01.477) 0:24:35.613 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK 
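The "Remove obsolete mounts" task above and the "Set up new/current mounts" task below both consume the mounts list returned by blivet: entries with state "absent" (the old /dev/mapper/foo-test1 mount) are unmounted and dropped from /etc/fstab first, then entries with state "mounted" (the new LUKS mapper device) are mounted and recorded. A minimal sketch of that loop with the mount module, using the blivet_output variable shown in the debug output above (not the role's exact tasks):

    - name: Remove obsolete mounts
      mount:
        path: "{{ item.path }}"
        state: absent
      loop: "{{ blivet_output.mounts | selectattr('state', 'equalto', 'absent') | list }}"

    - name: Set up new/current mounts
      mount:
        src: "{{ item.src }}"
        path: "{{ item.path }}"
        fstype: "{{ item.fstype }}"
        opts: "{{ item.opts | default('defaults') }}"
        state: mounted
      loop: "{{ blivet_output.mounts | selectattr('state', 'equalto', 'mounted') | list }}"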
[fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:09:33 -0400 (0:00:01.474) 0:24:37.087 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:09:34 -0400 (0:00:01.329) 0:24:38.417 ********** skipping: [managed-node3] => (item={'src': '/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:09:34 -0400 (0:00:00.292) 0:24:38.709 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:09:36 -0400 (0:00:01.807) 0:24:40.517 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715626.0419903, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776715616.3310072, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 197132486, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776715616.3290071, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3294972155", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:09:38 -0400 (0:00:01.574) 0:24:42.091 ********** changed: [managed-node3] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-2dc86b75-b498-4d2b-9924-478500dc0996', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-2dc86b75-b498-4d2b-9924-478500dc0996", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:09:39 -0400 (0:00:01.635) 0:24:43.726 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:09:41 -0400 (0:00:01.995) 0:24:45.721 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:499 Monday 20 April 2026 16:09:43 -0400 (0:00:01.691) 0:24:47.413 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:09:43 -0400 (0:00:00.506) 0:24:47.920 ********** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, 
"thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:09:44 -0400 (0:00:00.240) 0:24:48.160 ********** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:09:44 -0400 (0:00:00.197) 0:24:48.357 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "2dc86b75-b498-4d2b-9924-478500dc0996" }, "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "size": "4G", "type": "crypt", "uuid": "7b4baa98-4e25-4732-bf61-f846dee85201" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:09:45 -0400 (0:00:01.068) 0:24:49.426 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002590", "end": "2026-04-20 16:09:46.425245", "rc": 0, "start": "2026-04-20 16:09:46.422655" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:09:46 -0400 (0:00:01.145) 0:24:50.571 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002609", "end": "2026-04-20 16:09:47.720198", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:09:47.717589" } STDOUT: luks-2dc86b75-b498-4d2b-9924-478500dc0996 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:09:47 -0400 (0:00:01.343) 0:24:51.915 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Monday 20 April 2026 16:09:48 -0400 (0:00:00.395) 0:24:52.311 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Monday 20 April 2026 16:09:48 -0400 (0:00:00.235) 0:24:52.547 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.028120", "end": "2026-04-20 16:09:49.853778", "rc": 0, "start": "2026-04-20 16:09:49.825658" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Monday 20 April 2026 16:09:50 -0400 (0:00:01.512) 0:24:54.059 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Monday 20 April 2026 16:09:50 -0400 (0:00:00.323) 0:24:54.383 ********** included: 
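The single /etc/crypttab line read above maps the new mapper name to its backing logical volume, with "-" in the key-file field, i.e. no key file is referenced on disk (the passphrase itself was passed to the role as a no_log-protected parameter). The "Get VG shared value status" check that follows runs vgs against the pool and asserts the result; a simplified sketch of that verification, not the test's exact tasks:

    - name: Get VG shared value status
      command: vgs --noheadings --binary -o shared foo
      register: vgs_shared
      changed_when: false

    - name: Verify that VG shared value checks out
      assert:
        that:
          - vgs_shared.stdout | trim == '0'   # '0' = not a shared (lvmlockd) VG, matching the output above
        msg: "VG foo is unexpectedly marked shared"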
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Monday 20 April 2026 16:09:50 -0400 (0:00:00.523) 0:24:54.907 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Monday 20 April 2026 16:09:51 -0400 (0:00:00.471) 0:24:55.378 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Monday 20 April 2026 16:09:53 -0400 (0:00:01.585) 0:24:56.964 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Monday 20 April 2026 16:09:53 -0400 (0:00:00.265) 0:24:57.229 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Monday 20 April 2026 16:09:53 -0400 (0:00:00.221) 0:24:57.451 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Monday 20 April 2026 16:09:53 -0400 (0:00:00.275) 0:24:57.727 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Monday 20 April 2026 16:09:54 -0400 (0:00:00.317) 0:24:58.044 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Monday 20 April 2026 16:09:54 -0400 (0:00:00.249) 0:24:58.294 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 
Monday 20 April 2026 16:09:54 -0400 (0:00:00.283) 0:24:58.578 ********** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Monday 20 April 2026 16:09:55 -0400 (0:00:00.371) 0:24:58.950 ********** ok: [managed-node3] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.13.113 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Monday 20 April 2026 16:09:56 -0400 (0:00:01.369) 0:25:00.319 ********** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Monday 20 April 2026 16:09:56 -0400 (0:00:00.207) 0:25:00.527 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Monday 20 April 2026 16:09:57 -0400 (0:00:00.440) 0:25:00.968 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Monday 20 April 2026 16:09:57 -0400 (0:00:00.242) 0:25:01.211 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Monday 20 April 2026 16:09:57 -0400 (0:00:00.242) 0:25:01.453 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Monday 20 April 2026 16:09:57 -0400 (0:00:00.296) 0:25:01.750 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Monday 20 April 2026 16:09:58 -0400 (0:00:00.263) 0:25:02.013 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Monday 20 April 2026 16:09:58 
-0400 (0:00:00.224) 0:25:02.238 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Monday 20 April 2026 16:09:58 -0400 (0:00:00.288) 0:25:02.527 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Monday 20 April 2026 16:09:58 -0400 (0:00:00.213) 0:25:02.740 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Monday 20 April 2026 16:09:59 -0400 (0:00:00.286) 0:25:03.027 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Monday 20 April 2026 16:09:59 -0400 (0:00:00.315) 0:25:03.342 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Monday 20 April 2026 16:09:59 -0400 (0:00:00.275) 0:25:03.617 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Monday 20 April 2026 16:09:59 -0400 (0:00:00.160) 0:25:03.778 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Monday 20 April 2026 16:10:00 -0400 (0:00:00.372) 0:25:04.150 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Monday 20 April 2026 16:10:00 -0400 (0:00:00.431) 0:25:04.582 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Monday 20 April 2026 16:10:00 
-0400 (0:00:00.066) 0:25:04.648 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Monday 20 April 2026 16:10:00 -0400 (0:00:00.191) 0:25:04.840 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Monday 20 April 2026 16:10:01 -0400 (0:00:00.197) 0:25:05.038 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Monday 20 April 2026 16:10:01 -0400 (0:00:00.172) 0:25:05.210 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Monday 20 April 2026 16:10:01 -0400 (0:00:00.212) 0:25:05.422 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Monday 20 April 2026 16:10:01 -0400 (0:00:00.280) 0:25:05.703 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Monday 20 April 2026 16:10:01 -0400 (0:00:00.198) 0:25:05.901 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Monday 20 April 2026 16:10:02 -0400 (0:00:00.325) 0:25:06.226 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Monday 20 April 2026 16:10:02 -0400 (0:00:00.391) 0:25:06.618 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Monday 20 April 2026 16:10:02 -0400 (0:00:00.273) 0:25:06.891 ********** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Monday 20 April 2026 16:10:03 -0400 (0:00:00.232) 0:25:07.124 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Monday 20 April 2026 16:10:03 -0400 (0:00:00.230) 0:25:07.354 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Monday 20 April 2026 16:10:03 -0400 (0:00:00.163) 0:25:07.518 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Monday 20 April 2026 16:10:04 -0400 (0:00:00.475) 0:25:07.994 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Monday 20 April 2026 16:10:04 -0400 (0:00:00.196) 0:25:08.191 ********** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Monday 20 April 2026 16:10:04 -0400 (0:00:00.229) 0:25:08.420 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Monday 20 April 2026 16:10:04 -0400 (0:00:00.476) 0:25:08.897 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Monday 20 April 2026 16:10:05 -0400 (0:00:00.251) 0:25:09.148 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Monday 20 April 2026 
16:10:05 -0400 (0:00:00.236) 0:25:09.385 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Monday 20 April 2026 16:10:05 -0400 (0:00:00.253) 0:25:09.639 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Monday 20 April 2026 16:10:06 -0400 (0:00:00.313) 0:25:09.953 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Monday 20 April 2026 16:10:06 -0400 (0:00:00.201) 0:25:10.154 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Monday 20 April 2026 16:10:06 -0400 (0:00:00.231) 0:25:10.385 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Monday 20 April 2026 16:10:06 -0400 (0:00:00.145) 0:25:10.530 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Monday 20 April 2026 16:10:07 -0400 (0:00:00.553) 0:25:11.084 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Monday 20 April 2026 16:10:07 -0400 (0:00:00.372) 0:25:11.456 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Monday 20 April 2026 16:10:07 -0400 (0:00:00.392) 0:25:11.849 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Monday 20 April 2026 16:10:08 -0400 (0:00:00.306) 
0:25:12.155 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Monday 20 April 2026 16:10:08 -0400 (0:00:00.190) 0:25:12.345 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Monday 20 April 2026 16:10:08 -0400 (0:00:00.177) 0:25:12.523 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Monday 20 April 2026 16:10:08 -0400 (0:00:00.159) 0:25:12.683 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Monday 20 April 2026 16:10:08 -0400 (0:00:00.193) 0:25:12.877 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Monday 20 April 2026 16:10:09 -0400 (0:00:00.114) 0:25:12.991 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Monday 20 April 2026 16:10:09 -0400 (0:00:00.380) 0:25:13.372 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Monday 20 April 2026 16:10:09 -0400 (0:00:00.325) 0:25:13.697 ********** skipping: [managed-node3] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Monday 20 April 2026 16:10:10 -0400 (0:00:00.260) 0:25:13.957 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Monday 20 April 2026 16:10:10 -0400 (0:00:00.238) 0:25:14.196 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Monday 20 April 2026 16:10:10 -0400 (0:00:00.313) 0:25:14.510 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Monday 20 April 2026 16:10:10 -0400 (0:00:00.188) 0:25:14.699 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Monday 20 April 2026 16:10:10 -0400 (0:00:00.239) 0:25:14.938 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Monday 20 April 2026 16:10:11 -0400 (0:00:00.201) 0:25:15.139 ********** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Monday 20 April 2026 16:10:11 -0400 (0:00:00.185) 0:25:15.325 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:10:11 -0400 (0:00:00.462) 0:25:15.788 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:10:12 -0400 (0:00:00.292) 0:25:16.080 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:10:14 -0400 (0:00:02.197) 0:25:18.278 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:10:14 -0400 (0:00:00.329) 0:25:18.607 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:10:14 -0400 (0:00:00.268) 0:25:18.876 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:10:15 -0400 (0:00:00.302) 0:25:19.178 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:10:15 -0400 (0:00:00.325) 0:25:19.504 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:10:15 -0400 (0:00:00.370) 0:25:19.874 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:10:16 -0400 (0:00:00.292) 0:25:20.167 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:10:16 -0400 (0:00:00.305) 0:25:20.472 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:10:16 -0400 (0:00:00.228) 0:25:20.701 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:10:17 -0400 (0:00:00.271) 0:25:20.972 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:10:17 -0400 (0:00:00.280) 0:25:21.252 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:10:17 -0400 (0:00:00.144) 0:25:21.397 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:10:17 -0400 (0:00:00.456) 0:25:21.853 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:10:18 -0400 (0:00:00.215) 0:25:22.068 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:10:18 -0400 (0:00:00.250) 0:25:22.319 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:10:18 -0400 (0:00:00.227) 0:25:22.546 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 20 April 2026 16:10:18 -0400 (0:00:00.243) 0:25:22.789 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:10:19 -0400 (0:00:00.241) 0:25:23.031 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:10:19 -0400 (0:00:00.270) 0:25:23.302 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:10:19 -0400 (0:00:00.217) 0:25:23.519 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715766.467747, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715766.467747, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 318516, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715766.467747, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:10:20 -0400 (0:00:01.208) 0:25:24.727 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:10:21 -0400 (0:00:00.287) 0:25:25.015 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:10:21 -0400 (0:00:00.193) 0:25:25.208 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Monday 20 April 2026 16:10:21 -0400 (0:00:00.273) 0:25:25.482 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:10:21 -0400 (0:00:00.202) 0:25:25.685 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:10:21 -0400 (0:00:00.256) 0:25:25.941 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:10:22 -0400 (0:00:00.245) 0:25:26.187 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715766.6187468, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715766.6187468, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 336934, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776715766.6187468, "nlink": 1, "path": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:10:23 -0400 (0:00:01.695) 0:25:27.882 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:10:28 -0400 (0:00:04.417) 0:25:32.300 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010441", "end": "2026-04-20 16:10:29.710041", "rc": 0, "start": "2026-04-20 16:10:29.699600" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 2dc86b75-b498-4d2b-9924-478500dc0996 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 936216 Threads: 2 Salt: 9d 33 8e ec dc 46 b2 ec 66 29 ef 37 d0 32 e0 7f 0b 89 c3 82 6a be 84 
67 da 69 96 33 03 a7 36 b9 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120029 Salt: 78 ef 89 ef 86 6c 5b e4 6d 25 cd 02 02 f7 37 7d 04 28 a0 e5 f8 5d 9e 9d be 0c 73 fa 5d 40 ce 37 Digest: 31 d9 9d 4c a2 98 a2 02 90 8f 78 d5 fd 0b 90 b1 bb 51 0c 83 13 ca d0 3b 5b 69 0d f2 ba 39 1f b7 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:10:29 -0400 (0:00:01.633) 0:25:33.933 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:10:30 -0400 (0:00:00.365) 0:25:34.299 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:10:30 -0400 (0:00:00.426) 0:25:34.725 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:10:31 -0400 (0:00:00.293) 0:25:35.018 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:10:31 -0400 (0:00:00.332) 0:25:35.351 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:10:31 -0400 (0:00:00.340) 0:25:35.691 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:10:32 -0400 (0:00:00.480) 0:25:36.172 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:10:32 -0400 (0:00:00.275) 0:25:36.447 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-2dc86b75-b498-4d2b-9924-478500dc0996 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] 
******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:10:32 -0400 (0:00:00.377) 0:25:36.825 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:10:33 -0400 (0:00:00.299) 0:25:37.124 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:10:33 -0400 (0:00:00.269) 0:25:37.394 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:10:33 -0400 (0:00:00.319) 0:25:37.714 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:10:34 -0400 (0:00:00.313) 0:25:38.028 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:10:34 -0400 (0:00:00.171) 0:25:38.199 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:10:34 -0400 (0:00:00.192) 0:25:38.391 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:10:34 -0400 (0:00:00.170) 0:25:38.562 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:10:34 -0400 (0:00:00.239) 0:25:38.802 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:10:35 -0400 (0:00:00.295) 0:25:39.098 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:10:35 -0400 (0:00:00.281) 0:25:39.379 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:10:35 -0400 (0:00:00.240) 0:25:39.620 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:10:35 -0400 (0:00:00.263) 0:25:39.883 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:10:36 -0400 (0:00:00.336) 0:25:40.220 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:10:36 -0400 (0:00:00.280) 0:25:40.501 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:10:36 -0400 (0:00:00.226) 0:25:40.728 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:10:38 -0400 (0:00:01.365) 0:25:42.093 ********** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:10:39 -0400 (0:00:01.491) 0:25:43.585 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:10:39 -0400 
(0:00:00.250) 0:25:43.836 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:10:40 -0400 (0:00:00.195) 0:25:44.031 ********** ok: [managed-node3] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:10:41 -0400 (0:00:01.578) 0:25:45.609 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:10:41 -0400 (0:00:00.297) 0:25:45.907 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:10:42 -0400 (0:00:00.228) 0:25:46.136 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:10:42 -0400 (0:00:00.313) 0:25:46.449 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:10:42 -0400 (0:00:00.263) 0:25:46.712 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:10:43 -0400 (0:00:00.283) 0:25:46.996 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:10:43 -0400 (0:00:00.194) 0:25:47.191 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:10:43 -0400 (0:00:00.152) 0:25:47.344 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:10:43 -0400 (0:00:00.107) 
0:25:47.452 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:10:43 -0400 (0:00:00.120) 0:25:47.572 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:10:43 -0400 (0:00:00.155) 0:25:47.728 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:10:43 -0400 (0:00:00.209) 0:25:47.938 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:10:44 -0400 (0:00:00.148) 0:25:48.087 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:10:44 -0400 (0:00:00.163) 0:25:48.251 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:10:44 -0400 (0:00:00.164) 0:25:48.416 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:10:44 -0400 (0:00:00.184) 0:25:48.600 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:10:44 -0400 (0:00:00.168) 0:25:48.768 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:10:44 -0400 (0:00:00.153) 0:25:48.921 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:10:45 -0400 
(0:00:00.174) 0:25:49.096 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:10:45 -0400 (0:00:00.195) 0:25:49.292 ********** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:10:45 -0400 (0:00:00.228) 0:25:49.520 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:10:45 -0400 (0:00:00.280) 0:25:49.800 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:10:46 -0400 (0:00:00.302) 0:25:50.103 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023557", "end": "2026-04-20 16:10:47.205408", "rc": 0, "start": "2026-04-20 16:10:47.181851" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:10:47 -0400 (0:00:01.285) 0:25:51.388 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:10:47 -0400 (0:00:00.248) 0:25:51.637 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:10:47 -0400 (0:00:00.253) 0:25:51.891 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:10:48 -0400 (0:00:00.216) 0:25:52.108 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task 
path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:10:48 -0400 (0:00:00.199) 0:25:52.307 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:10:48 -0400 (0:00:00.270) 0:25:52.578 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:10:48 -0400 (0:00:00.244) 0:25:52.822 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:10:49 -0400 (0:00:00.211) 0:25:53.033 ********** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:10:49 -0400 (0:00:00.106) 0:25:53.139 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:502 Monday 20 April 2026 16:10:49 -0400 (0:00:00.166) 0:25:53.306 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node3 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Monday 20 April 2026 16:10:49 -0400 (0:00:00.404) 0:25:53.710 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Monday 20 April 2026 16:10:50 -0400 (0:00:00.293) 0:25:54.004 ********** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Monday 20 April 2026 16:10:50 -0400 (0:00:00.344) 0:25:54.348 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Monday 20 April 2026 16:10:50 -0400 (0:00:00.198) 0:25:54.546 ********** ok: [managed-node3] TASK 
[fedora.linux_system_roles.storage : Record role begin fingerprint] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Monday 20 April 2026 16:10:52 -0400 (0:00:02.061) 0:25:56.608 ********** ok: [managed-node3] => { "changed": false } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:13 Monday 20 April 2026 16:10:54 -0400 (0:00:01.376) 0:25:57.985 ********** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node3] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:31 Monday 20 April 2026 16:10:54 -0400 (0:00:00.417) 0:25:58.403 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:36 Monday 20 April 2026 16:10:54 -0400 (0:00:00.253) 0:25:58.656 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Monday 20 April 2026 16:10:54 -0400 (0:00:00.141) 0:25:58.798 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Monday 20 April 2026 16:10:54 -0400 (0:00:00.123) 0:25:58.922 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Monday 20 April 2026 16:10:55 -0400 (0:00:00.208) 0:25:59.130 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Monday 20 April 2026 16:10:55 -0400 (0:00:00.794) 0:25:59.925 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Monday 20 April 2026 16:10:56 -0400 (0:00:00.174) 0:26:00.099 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Monday 20 April 2026 16:10:56 -0400 (0:00:00.097) 0:26:00.197 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Monday 20 April 2026 16:11:00 -0400 (0:00:04.112) 0:26:04.310 ********** ok: [managed-node3] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Monday 20 April 2026 16:11:00 -0400 (0:00:00.308) 0:26:04.618 ********** ok: [managed-node3] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Monday 20 April 2026 16:11:00 -0400 (0:00:00.291) 0:26:04.910 ********** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Monday 20 April 2026 16:11:06 -0400 (0:00:05.543) 0:26:10.453 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Monday 20 April 2026 16:11:06 -0400 (0:00:00.392) 0:26:10.846 ********** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Monday 20 April 
2026 16:11:07 -0400 (0:00:00.184) 0:26:11.030 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Monday 20 April 2026 16:11:07 -0400 (0:00:00.283) 0:26:11.313 ********** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Monday 20 April 2026 16:11:07 -0400 (0:00:00.129) 0:26:11.442 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Monday 20 April 2026 16:11:11 -0400 (0:00:04.308) 0:26:15.751 ********** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": 
"systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": 
"systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Monday 20 April 2026 16:11:14 -0400 (0:00:03.020) 0:26:18.772 ********** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Monday 20 April 2026 16:11:15 -0400 (0:00:00.276) 0:26:19.049 ********** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-2dc86b75-b498-4d2b-9924-478500dc0996", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Monday 20 April 2026 16:11:21 -0400 (0:00:05.965) 0:26:25.015 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Monday 20 April 2026 16:11:21 -0400 (0:00:00.244) 0:26:25.260 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715774.2627335, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f5f9efe6295afb9c0faf4f3be83d079e4870d1cb", "ctime": 1776715774.2597334, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 73400516, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776715774.2597334, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "319406980", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Monday 20 April 2026 16:11:22 -0400 (0:00:01.550) 0:26:26.810 ********** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Monday 20 April 2026 16:11:24 -0400 (0:00:01.440) 0:26:28.250 ********** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Monday 20 April 2026 16:11:24 -0400 (0:00:00.443) 0:26:28.693 ********** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy 
device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-2dc86b75-b498-4d2b-9924-478500dc0996", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Monday 20 April 2026 16:11:25 -0400 (0:00:00.391) 0:26:29.085 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Monday 20 April 2026 16:11:25 -0400 (0:00:00.260) 0:26:29.345 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] 
************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Monday 20 April 2026 16:11:25 -0400 (0:00:00.244) 0:26:29.590 ********** changed: [managed-node3] => (item={'src': '/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2dc86b75-b498-4d2b-9924-478500dc0996" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Monday 20 April 2026 16:11:27 -0400 (0:00:01.714) 0:26:31.304 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Monday 20 April 2026 16:11:29 -0400 (0:00:01.982) 0:26:33.287 ********** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Monday 20 April 2026 16:11:29 -0400 (0:00:00.162) 0:26:33.449 ********** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Monday 20 April 2026 16:11:29 -0400 (0:00:00.178) 0:26:33.628 ********** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Monday 20 April 2026 16:11:31 -0400 (0:00:01.893) 0:26:35.521 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715787.7187102, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9a310f9c8284e6ec9f609638a7f5564b1fa5c25f", "ctime": 1776715779.4907243, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 373293197, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776715779.4887245, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2722908372", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Monday 20 April 2026 16:11:33 -0400 (0:00:01.677) 0:26:37.198 ********** changed: [managed-node3] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 
'luks-2dc86b75-b498-4d2b-9924-478500dc0996', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-2dc86b75-b498-4d2b-9924-478500dc0996", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Monday 20 April 2026 16:11:35 -0400 (0:00:01.977) 0:26:39.176 ********** ok: [managed-node3] TASK [fedora.linux_system_roles.storage : Record role success fingerprint] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 Monday 20 April 2026 16:11:37 -0400 (0:00:01.894) 0:26:41.071 ********** ok: [managed-node3] => { "changed": false } TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:511 Monday 20 April 2026 16:11:38 -0400 (0:00:01.591) 0:26:42.662 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Monday 20 April 2026 16:11:39 -0400 (0:00:00.611) 0:26:43.274 ********** skipping: [managed-node3] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Monday 20 April 2026 16:11:39 -0400 (0:00:00.370) 0:26:43.710 ********** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=ueM618-T4yK-lFh2-IvIn-4a1q-YHsK-ucMUoN", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Monday 20 April 2026 16:11:40 -0400 (0:00:00.338) 0:26:44.048 ********** ok: [managed-node3] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Monday 20 April 2026 16:11:41 -0400 (0:00:01.672) 0:26:45.721 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002656", "end": "2026-04-20 16:11:43.177426", "rc": 0, "start": "2026-04-20 16:11:43.174770" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Monday 20 April 2026 16:11:43 -0400 (0:00:01.747) 0:26:47.468 ********** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002663", "end": "2026-04-20 16:11:44.541404", "failed_when_result": false, "rc": 0, "start": "2026-04-20 16:11:44.538741" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Monday 20 April 2026 16:11:44 -0400 (0:00:01.238) 0:26:48.707 ********** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Monday 20 April 2026 16:11:44 -0400 (0:00:00.148) 0:26:48.855 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Monday 20 April 2026 16:11:45 -0400 (0:00:00.355) 0:26:49.211 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Monday 20 April 2026 16:11:45 -0400 (0:00:00.267) 0:26:49.478 ********** included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for 
managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Monday 20 April 2026 16:11:46 -0400 (0:00:01.160) 0:26:50.639 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Monday 20 April 2026 16:11:46 -0400 (0:00:00.260) 0:26:50.900 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Monday 20 April 2026 16:11:47 -0400 (0:00:00.342) 0:26:51.242 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Monday 20 April 2026 16:11:47 -0400 (0:00:00.344) 0:26:51.587 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Monday 20 April 2026 16:11:47 -0400 (0:00:00.132) 0:26:51.720 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Monday 20 April 2026 16:11:47 -0400 (0:00:00.186) 0:26:51.907 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Monday 20 April 2026 16:11:48 -0400 (0:00:00.220) 0:26:52.127 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Monday 20 April 2026 16:11:48 -0400 (0:00:00.135) 0:26:52.263 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Monday 20 April 2026 16:11:48 -0400 (0:00:00.131) 0:26:52.394 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Monday 20 April 2026 16:11:48 -0400 (0:00:00.141) 0:26:52.535 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Monday 20 April 2026 16:11:48 -0400 (0:00:00.165) 0:26:52.701 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Monday 20 April 2026 16:11:48 -0400 (0:00:00.117) 0:26:52.819 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Monday 20 April 2026 16:11:49 -0400 (0:00:00.334) 0:26:53.153 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Monday 20 April 2026 16:11:49 -0400 (0:00:00.287) 0:26:53.440 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Monday 20 April 2026 16:11:49 -0400 (0:00:00.257) 0:26:53.698 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Monday 20 April 2026 16:11:50 -0400 (0:00:00.307) 0:26:54.005 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Monday 
20 April 2026 16:11:50 -0400 (0:00:00.252) 0:26:54.258 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Monday 20 April 2026 16:11:50 -0400 (0:00:00.197) 0:26:54.455 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Monday 20 April 2026 16:11:50 -0400 (0:00:00.314) 0:26:54.769 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Monday 20 April 2026 16:11:51 -0400 (0:00:00.245) 0:26:55.014 ********** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1776715880.758549, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1776715880.758549, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 38585, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776715880.758549, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Monday 20 April 2026 16:11:52 -0400 (0:00:01.533) 0:26:56.548 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Monday 20 April 2026 16:11:52 -0400 (0:00:00.359) 0:26:56.907 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Monday 20 April 2026 16:11:53 -0400 (0:00:00.222) 0:26:57.129 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 
Monday 20 April 2026 16:11:53 -0400 (0:00:00.157) 0:26:57.287 ********** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Monday 20 April 2026 16:11:53 -0400 (0:00:00.278) 0:26:57.566 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Monday 20 April 2026 16:11:53 -0400 (0:00:00.258) 0:26:57.824 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Monday 20 April 2026 16:11:54 -0400 (0:00:00.214) 0:26:58.039 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Monday 20 April 2026 16:11:54 -0400 (0:00:00.306) 0:26:58.345 ********** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Monday 20 April 2026 16:11:58 -0400 (0:00:04.301) 0:27:02.647 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Monday 20 April 2026 16:11:58 -0400 (0:00:00.216) 0:27:02.863 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Monday 20 April 2026 16:11:59 -0400 (0:00:00.115) 0:27:02.979 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Monday 20 April 2026 16:11:59 -0400 (0:00:00.202) 0:27:03.182 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Monday 20 April 2026 16:11:59 -0400 (0:00:00.156) 0:27:03.339 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Monday 20 April 2026 16:11:59 -0400 (0:00:00.205) 0:27:03.545 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Monday 20 April 2026 16:11:59 -0400 (0:00:00.193) 0:27:03.738 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Monday 20 April 2026 16:12:00 -0400 (0:00:00.219) 0:27:03.957 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Monday 20 April 2026 16:12:00 -0400 (0:00:00.251) 0:27:04.209 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Monday 20 April 2026 16:12:00 -0400 (0:00:00.344) 0:27:04.554 ********** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Monday 20 April 2026 16:12:00 -0400 (0:00:00.245) 0:27:04.799 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Monday 20 April 2026 16:12:01 -0400 (0:00:00.238) 0:27:05.037 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Monday 20 April 2026 16:12:01 -0400 (0:00:00.195) 0:27:05.233 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Monday 20 April 2026 16:12:01 -0400 (0:00:00.288) 0:27:05.522 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Monday 20 April 2026 16:12:01 -0400 (0:00:00.287) 0:27:05.809 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Monday 20 April 2026 16:12:02 -0400 (0:00:00.237) 0:27:06.047 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Monday 20 April 2026 16:12:02 -0400 (0:00:00.243) 0:27:06.291 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Monday 20 April 2026 16:12:02 -0400 (0:00:00.335) 0:27:06.626 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Monday 20 April 2026 16:12:02 -0400 (0:00:00.254) 0:27:06.880 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Monday 20 April 2026 16:12:03 -0400 (0:00:00.307) 0:27:07.188 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Monday 20 April 2026 16:12:03 -0400 (0:00:00.296) 0:27:07.484 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Monday 20 April 2026 16:12:03 -0400 (0:00:00.231) 0:27:07.716 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Monday 20 April 2026 16:12:04 -0400 (0:00:00.330) 0:27:08.046 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Monday 20 April 2026 16:12:04 -0400 (0:00:00.259) 0:27:08.306 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Monday 20 April 2026 16:12:04 -0400 (0:00:00.277) 0:27:08.583 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Monday 20 April 2026 16:12:04 -0400 (0:00:00.278) 0:27:08.862 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Monday 20 April 2026 16:12:05 -0400 (0:00:00.326) 0:27:09.188 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Monday 20 April 2026 16:12:05 -0400 (0:00:00.214) 0:27:09.402 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Monday 20 April 2026 16:12:05 -0400 (0:00:00.257) 0:27:09.660 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Monday 20 April 2026 16:12:06 -0400 (0:00:00.337) 0:27:09.997 ********** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Monday 20 April 2026 16:12:06 -0400 (0:00:00.261) 0:27:10.258 ********** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Monday 20 April 2026 16:12:06 -0400 (0:00:00.308) 0:27:10.566 ********** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Monday 20 April 2026 16:12:06 -0400 (0:00:00.356) 0:27:10.923 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Monday 20 April 2026 16:12:07 -0400 (0:00:00.373) 0:27:11.296 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Monday 20 April 2026 16:12:07 -0400 (0:00:00.268) 0:27:11.565 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Monday 20 April 2026 16:12:07 -0400 (0:00:00.196) 0:27:11.761 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Monday 20 April 2026 16:12:08 -0400 (0:00:00.256) 0:27:12.018 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Monday 20 April 2026 16:12:08 -0400 (0:00:00.272) 0:27:12.291 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Monday 20 April 2026 16:12:08 -0400 (0:00:00.293) 0:27:12.585 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Monday 20 April 2026 16:12:08 -0400 (0:00:00.248) 0:27:12.834 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Monday 20 April 2026 16:12:09 -0400 (0:00:00.311) 0:27:13.145 ********** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Monday 20 April 2026 16:12:09 -0400 (0:00:00.322) 0:27:13.468 ********** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Monday 20 April 2026 16:12:09 -0400 (0:00:00.292) 0:27:13.760 ********** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Monday 20 April 2026 16:12:10 -0400 (0:00:00.262) 0:27:14.023 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Monday 20 April 2026 16:12:10 -0400 (0:00:00.177) 0:27:14.200 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Monday 20 April 2026 16:12:10 -0400 (0:00:00.239) 0:27:14.439 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Monday 20 April 2026 16:12:10 -0400 (0:00:00.208) 0:27:14.648 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Monday 20 April 2026 16:12:10 -0400 (0:00:00.213) 0:27:14.861 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Monday 20 April 2026 16:12:11 -0400 (0:00:00.244) 0:27:15.106 ********** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Monday 20 April 2026 16:12:11 -0400 (0:00:00.197) 0:27:15.304 ********** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Monday 20 April 2026 16:12:11 -0400 (0:00:00.226) 0:27:15.530 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Monday 20 April 2026 16:12:11 -0400 (0:00:00.236) 0:27:15.766 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Monday 20 April 2026 16:12:12 -0400 
(0:00:00.190) 0:27:15.956 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Monday 20 April 2026 16:12:12 -0400 (0:00:00.253) 0:27:16.209 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Monday 20 April 2026 16:12:12 -0400 (0:00:00.251) 0:27:16.461 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Monday 20 April 2026 16:12:12 -0400 (0:00:00.241) 0:27:16.702 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Monday 20 April 2026 16:12:12 -0400 (0:00:00.209) 0:27:16.912 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Monday 20 April 2026 16:12:13 -0400 (0:00:00.237) 0:27:17.149 ********** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Monday 20 April 2026 16:12:13 -0400 (0:00:00.203) 0:27:17.353 ********** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Monday 20 April 2026 16:12:13 -0400 (0:00:00.162) 0:27:17.515 ********** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node3 : ok=1278 changed=60 unreachable=0 failed=9 skipped=1115 rescued=9 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:46:23.989557+00:00Z", "host": "managed-node3", "message": "encrypted volume 'foo' missing key/password", "start_time": "2026-04-20T19:46:18.578036+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:46:24.314745+00:00Z", "host": 
"managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:46:23.996839+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:48:45.680296+00:00Z", "host": "managed-node3", "message": "cannot remove existing formatting on device 'luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770' in safe mode due to encryption removal", "start_time": "2026-04-20T19:48:40.099777+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:48:45.882900+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, 
"diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-95fcf1a3-b37a-4b7a-9bf2-8d197d787770' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:48:45.711285+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:50:51.809270+00:00Z", "host": "managed-node3", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-04-20T19:50:46.997644+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:50:52.272602+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:50:51.958543+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:53:02.343453+00:00Z", "host": "managed-node3", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-20T19:52:56.855192+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:53:02.587064+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": 
false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:53:02.350251+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:55:39.969744+00:00Z", "host": "managed-node3", "message": "cannot remove existing formatting on device 'luks-36889b12-f60f-49c7-ac51-807f73989f4a' in safe mode due to encryption removal", "start_time": "2026-04-20T19:55:34.368501+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:55:40.301561+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, 
"actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-36889b12-f60f-49c7-ac51-807f73989f4a' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:55:39.980417+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:58:10.279469+00:00Z", "host": "managed-node3", "message": "cannot remove 
existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-04-20T19:58:04.303709+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T19:58:10.581542+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 
'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T19:58:10.287365+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T20:00:55.401255+00:00Z", "host": "managed-node3", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-20T20:00:49.989133+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T20:00:55.644147+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T20:00:55.429084+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T20:05:54.292126+00:00Z", "host": "managed-node3", "message": "cannot remove existing formatting on device 'luks-54f1b36d-4103-4a30-8816-49506ecb6f36' in safe mode due to encryption removal", "start_time": "2026-04-20T20:05:48.821887+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T20:05:54.576565+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-54f1b36d-4103-4a30-8816-49506ecb6f36' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T20:05:54.356929+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T20:08:45.789231+00:00Z", "host": "managed-node3", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-04-20T20:08:40.590289+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-04-20T20:08:46.015674+00:00Z", "host": "managed-node3", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-20T20:08:45.796682+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Monday 20 April 2026 16:12:13 -0400 (0:00:00.211) 0:27:17.726 ********** =============================================================================== fedora.linux_system_roles.storage : Record role success fingerprint ---- 19.06s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:16 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.85s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.05s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.97s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.94s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.68s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.40s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Include the appropriate provider tasks -- 11.18s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.36s 
/tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.28s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.01s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.98s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.93s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.85s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.73s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.72s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.69s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.63s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.61s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.60s /tmp/collections-hb0/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
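Editor's note on the failures recorded above: the PLAY RECAP (failed=9, rescued=9) together with the SYSTEM ROLES ERRORS list shows that every failure in this run is one the test provokes deliberately and then rescues. The three patterns are a volume requested with "encryption": true but no key or password, a safe-mode refusal to remove existing formatting when encryption is being removed, and a safe-mode refusal when encryption is being added. The sketch below is not taken from this log; it is a minimal, hypothetical invocation assuming only the role variables already visible in the module_args above (storage_volumes, storage_safe_mode, and the per-volume encryption settings), showing how a caller would avoid these particular errors. The passphrase value is a placeholder.

    # Hypothetical playbook sketch; variable names mirror the module_args in the error records above.
    - hosts: managed-node3
      vars:
        # Relax safe mode so the role may replace existing formatting when
        # adding or removing the LUKS layer (the "in safe mode" errors above).
        storage_safe_mode: false
      tasks:
        - name: Create an encrypted disk volume with a key supplied
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo                  # volume name used by this test
                type: disk
                disks: ["sda"]
                fs_type: xfs
                mount_point: /opt/test1
                size: 10737418240
                encryption: true
                encryption_password: "example-passphrase"   # placeholder; avoids "missing key/password"

Supplying encryption_password (or an encryption_key) satisfies the "encrypted volume ... missing key/password" check, and storage_safe_mode: false corresponds to the "safe_mode": false values visible in some of the module_args above, which is what lifts the refusal to remove existing formatting.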