ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Friday 30 January 2026 18:36:21 -0500 (0:00:00.371) 0:00:00.371 ********
ok: [managed-node2]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20
Friday 30 January 2026 18:36:25 -0500 (0:00:04.491) 0:00:04.863 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28
Friday 30 January 2026 18:36:25 -0500 (0:00:00.357) 0:00:05.220 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39
Friday 30 January 2026 18:36:26 -0500 (0:00:00.147) 0:00:05.368 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43
Friday 30 January 2026 18:36:26 -0500 (0:00:00.600) 0:00:05.968 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53
Friday 30 January 2026 18:36:26 -0500 (0:00:00.287) 0:00:06.256 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59
Friday 30 January 2026 18:36:27 -0500 (0:00:00.316) 0:00:06.572 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68
Friday 30 January 2026 18:36:27 -0500 (0:00:00.403) 0:00:06.976 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
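
All seven FIPS preparation tasks above were skipped, so this run exercises LUKS2 without FIPS mode. For reference, a minimal sketch of what such tasks presumably amount to on a RHEL/CentOS 8 host (fips-mode-setup is the standard utility there; the guard variable is hypothetical, since the test's real conditional is not visible in this log):

    - name: Enable FIPS mode (sketch, not the test's verbatim task)
      command: fips-mode-setup --enable    # standard RHEL/CentOS 8 utility
      when: storage_test_fips | d(false)   # hypothetical guard variable

    - name: Reboot so the FIPS-enabled initramfs takes effect
      reboot:
      when: storage_test_fips | d(false)   # hypothetical guard variable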

TASK [Run the role] ************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72
Friday 30 January 2026 18:36:28 -0500 (0:00:00.449) 0:00:07.425 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:36:28 -0500 (0:00:00.368) 0:00:07.794 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:36:28 -0500 (0:00:00.318) 0:00:08.112 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:36:29 -0500 (0:00:00.450) 0:00:08.562 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
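
The two identical "ok" results above come from loading roles/storage/vars/CentOS_8.yml twice in the same loop. Judging by the logged fact, that vars file presumably looks roughly like this (a reconstruction from the output, not the file itself):

    # roles/storage/vars/CentOS_8.yml (reconstructed sketch)
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - vdo
      - kmod-kvdo
      - xfsprogs
      - stratisd
      - stratis-cli
      - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"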

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:36:29 -0500 (0:00:00.715) 0:00:09.278 ********
ok: [managed-node2] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:36:32 -0500 (0:00:02.718) 0:00:11.996 ********
ok: [managed-node2] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:36:33 -0500 (0:00:00.483) 0:00:12.480 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:36:33 -0500 (0:00:00.107) 0:00:12.588 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 30 January 2026 18:36:33 -0500 (0:00:00.193) 0:00:12.781 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 30 January 2026 18:36:34 -0500 (0:00:00.608) 0:00:13.390 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 30 January 2026 18:36:40 -0500 (0:00:06.222) 0:00:19.612 ********
ok: [managed-node2] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 30 January 2026 18:36:40 -0500 (0:00:00.311) 0:00:19.924 ********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 30 January 2026 18:36:41 -0500 (0:00:00.395) 0:00:20.319 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
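
The "Get required packages" result above is empty because no pools or volumes were requested on this first pass. A plausible sketch of how the role queries its blivet module for packages without changing anything (the packages_only flag and the register name are assumptions, not confirmed by this log):

    - name: Get required packages (sketch of the role's probe)
      blivet:
        pools: "{{ _storage_pools_list }}"
        volumes: "{{ _storage_volumes_list }}"
        packages_only: true    # assumed: report needed packages, touch nothing
      register: package_info   # hypothetical name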

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 30 January 2026 18:36:44 -0500 (0:00:03.711) 0:00:24.030 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 30 January 2026 18:36:45 -0500 (0:00:00.635) 0:00:24.666 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 30 January 2026 18:36:45 -0500 (0:00:00.159) 0:00:24.825 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 30 January 2026 18:36:45 -0500 (0:00:00.301) 0:00:25.127 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Friday 30 January 2026 18:36:45 -0500 (0:00:00.127) 0:00:25.254 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Friday 30 January 2026 18:36:49 -0500 (0:00:03.826) 0:00:29.081 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state":
"active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": 
"teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:36:53 -0500 (0:00:03.980) 0:00:33.061 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:36:54 -0500 (0:00:00.250) 0:00:33.312 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:36:54 -0500 (0:00:00.134) 0:00:33.447 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:36:56 -0500 (0:00:01.996) 0:00:35.443 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:36:56 -0500 (0:00:00.206) 0:00:35.650 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769815883.7027411, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1769815881.9577456, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769815881.9577456, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", 
"readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:36:58 -0500 (0:00:01.672) 0:00:37.323 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:36:58 -0500 (0:00:00.277) 0:00:37.600 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:36:58 -0500 (0:00:00.151) 0:00:37.752 ******** ok: [managed-node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:36:58 -0500 (0:00:00.274) 0:00:38.027 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:36:58 -0500 (0:00:00.250) 0:00:38.277 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:36:59 -0500 (0:00:00.227) 0:00:38.505 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:36:59 -0500 (0:00:00.223) 0:00:38.728 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:36:59 -0500 (0:00:00.273) 0:00:39.002 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:36:59 -0500 (0:00:00.202) 0:00:39.205 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:37:00 -0500 (0:00:00.250) 0:00:39.456 
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Friday 30 January 2026 18:37:00 -0500 (0:00:00.237) 0:00:39.693 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769815094.988674, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Friday 30 January 2026 18:37:02 -0500 (0:00:01.632) 0:00:41.326 ********

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Friday 30 January 2026 18:37:02 -0500 (0:00:00.168) 0:00:41.494 ********
ok: [managed-node2]

TASK [Get unused disks] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:76
Friday 30 January 2026 18:37:04 -0500 (0:00:01.839) 0:00:43.334 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node2

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Friday 30 January 2026 18:37:04 -0500 (0:00:00.241) 0:00:43.576 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Friday 30 January 2026 18:37:08 -0500 (0:00:04.086) 0:00:47.663 ********
ok: [managed-node2] => { "changed": false, "disks": [ "sda" ], "info": [
"Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"",
"Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"",
"Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"",
"Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"",
"Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"",
"Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"",
"Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"",
"Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Friday 30 January 2026 18:37:10 -0500 (0:00:02.076) 0:00:49.739 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Friday 30 January 2026 18:37:10 -0500 (0:00:00.227) 0:00:49.967 ******** ok: [managed-node2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Friday 30 January 2026 18:37:10 -0500 (0:00:00.234) 0:00:50.201 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Friday 30 January 2026 18:37:11 -0500 (0:00:00.196) 0:00:50.397 ******** ok: [managed-node2] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:85 Friday 30 January 2026 18:37:11 -0500 (0:00:00.160) 0:00:50.558 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:37:11 -0500 (0:00:00.263) 0:00:50.821 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:37:11 -0500 (0:00:00.286) 0:00:51.107 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:37:12 -0500 (0:00:00.376) 0:00:51.484 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK 

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:85
Friday 30 January 2026 18:37:11 -0500 (0:00:00.160) 0:00:50.558 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Friday 30 January 2026 18:37:11 -0500 (0:00:00.263) 0:00:50.821 ********
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Friday 30 January 2026 18:37:11 -0500 (0:00:00.286) 0:00:51.107 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:37:12 -0500 (0:00:00.376) 0:00:51.484 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:37:12 -0500 (0:00:00.288) 0:00:51.772 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:37:12 -0500 (0:00:00.279) 0:00:52.051 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:37:13 -0500 (0:00:00.622) 0:00:52.674 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:37:13 -0500 (0:00:00.306) 0:00:52.980 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:37:13 -0500 (0:00:00.176) 0:00:53.156 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:37:14 -0500 (0:00:00.222) 0:00:53.379 ********
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:37:14 -0500 (0:00:00.124) 0:00:53.503 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:37:14 -0500 (0:00:00.235) 0:00:53.739 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:37:18 -0500 (0:00:04.258) 0:00:57.997 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:37:18 -0500 (0:00:00.207) 0:00:58.204 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:37:19 -0500 (0:00:00.300) 0:00:58.505 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:37:24 -0500 (0:00:05.037) 0:01:03.543 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:37:24 -0500 (0:00:00.227) 0:01:03.771 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:37:24 -0500 (0:00:00.170) 0:01:03.941 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:37:24 -0500 (0:00:00.298) 0:01:04.240 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:37:25 -0500 (0:00:00.137) 0:01:04.378 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:37:29 -0500 (0:00:04.408) 0:01:08.786 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:37:32 -0500 (0:00:02.815) 0:01:11.602 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:37:32 -0500 (0:00:00.316) 0:01:11.919 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:37:32 -0500 (0:00:00.226) 0:01:12.145 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:37:38 -0500 (0:00:05.468) 0:01:17.613 ******** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:37:38 -0500 (0:00:00.188) 0:01:17.802 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 30 January 2026 18:37:38 -0500 (0:00:00.232) 0:01:18.034 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 30 January 2026 18:37:38 -0500 (0:00:00.179) 0:01:18.214 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify 
correct exception or error message] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 30 January 2026 18:37:39 -0500 (0:00:00.263) 0:01:18.477 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:101 Friday 30 January 2026 18:37:39 -0500 (0:00:00.146) 0:01:18.623 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:37:39 -0500 (0:00:00.317) 0:01:18.941 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:37:39 -0500 (0:00:00.224) 0:01:19.208 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:37:40 -0500 (0:00:00.191) 0:01:19.400 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:37:40 -0500 (0:00:00.253) 0:01:19.654 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to 
indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:37:40 -0500 (0:00:00.105) 0:01:19.759 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:37:40 -0500 (0:00:00.160) 0:01:19.920 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:37:40 -0500 (0:00:00.116) 0:01:20.036 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:37:40 -0500 (0:00:00.060) 0:01:20.097 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:37:41 -0500 (0:00:00.228) 0:01:20.325 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:37:44 -0500 (0:00:03.632) 0:01:23.958 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:37:44 -0500 (0:00:00.192) 0:01:24.150 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:37:45 -0500 (0:00:00.211) 0:01:24.362 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:37:49 -0500 (0:00:04.762) 0:01:29.124 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR 
support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:37:50 -0500 (0:00:00.304) 0:01:29.428 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:37:50 -0500 (0:00:00.145) 0:01:29.574 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:37:50 -0500 (0:00:00.162) 0:01:29.736 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:37:50 -0500 (0:00:00.167) 0:01:29.903 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:37:55 -0500 (0:00:04.508) 0:01:34.412 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": 
"mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:37:57 -0500 (0:00:02.633) 0:01:37.045 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:37:57 -0500 (0:00:00.238) 0:01:37.284 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:37:58 -0500 (0:00:00.135) 0:01:37.419 ******** changed: [managed-node2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-63563baa-e47a-4110-ace5-a19589f17bad", "password": "-", "state": "present" } ], 
"leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:38:11 -0500 (0:00:13.354) 0:01:50.774 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:38:11 -0500 (0:00:00.229) 0:01:51.003 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769815883.7027411, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1769815881.9577456, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769815881.9577456, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:38:13 -0500 (0:00:01.503) 0:01:52.507 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the 
systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:38:15 -0500 (0:00:02.695) 0:01:55.203 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:38:16 -0500 (0:00:00.207) 0:01:55.410 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-63563baa-e47a-4110-ace5-a19589f17bad", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:38:16 -0500 (0:00:00.208) 0:01:55.619 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:38:16 -0500 (0:00:00.188) 0:01:55.807 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:38:16 -0500 (0:00:00.186) 0:01:55.994 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:38:16 -0500 (0:00:00.193) 0:01:56.188 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:38:20 -0500 (0:00:03.833) 0:02:00.022 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:38:22 -0500 (0:00:02.125) 0:02:02.148 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:38:23 -0500 (0:00:00.257) 0:02:02.405 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:38:24 -0500 (0:00:01.706) 0:02:04.112 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769815094.988674, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:38:25 -0500 (0:00:01.021) 0:02:05.134 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-63563baa-e47a-4110-ace5-a19589f17bad', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-63563baa-e47a-4110-ace5-a19589f17bad", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:38:27 -0500 (0:00:01.187) 0:02:06.321 ******** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:114 Friday 30 January 2026 18:38:28 -0500 (0:00:01.763) 0:02:08.085 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:38:29 -0500 (0:00:00.382) 0:02:08.467 ******** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:38:29 -0500 (0:00:00.112) 0:02:08.580 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:38:29 -0500 (0:00:00.194) 0:02:08.775 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "size": "10G", "type": "crypt", "uuid": "97720b5f-0f12-4993-8a08-b19dae0fe6a4" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "63563baa-e47a-4110-ace5-a19589f17bad" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:38:32 -0500 (0:00:02.863) 0:02:11.638 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002605", "end": "2026-01-30 
18:38:35.178297", "rc": 0, "start": "2026-01-30 18:38:35.175692" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:38:35 -0500 (0:00:03.044) 0:02:14.682 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002588", "end": "2026-01-30 18:38:36.560983", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:38:36.558395" } STDOUT: luks-63563baa-e47a-4110-ace5-a19589f17bad /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:38:36 -0500 (0:00:01.410) 0:02:16.092 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:38:36 -0500 (0:00:00.137) 0:02:16.230 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:38:37 -0500 (0:00:00.388) 0:02:16.619 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:38:37 -0500 (0:00:00.221) 0:02:16.840 ******** included: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:38:38 -0500 (0:00:01.136) 0:02:17.977 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:38:39 -0500 (0:00:00.332) 0:02:18.309 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:38:39 -0500 (0:00:00.683) 0:02:18.993 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:38:40 -0500 (0:00:00.391) 0:02:19.384 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 30 January 2026 18:38:40 -0500 (0:00:00.246) 0:02:19.631 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 30 January 2026 18:38:40 -0500 (0:00:00.164) 0:02:19.795 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 30 January 2026 18:38:40 -0500 (0:00:00.298) 0:02:20.093 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 30 January 2026 18:38:41 -0500 (0:00:00.317) 0:02:20.411 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 30 January 2026 18:38:41 -0500 (0:00:00.206) 0:02:20.617 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 30 January 2026 18:38:41 -0500 (0:00:00.249) 0:02:20.866 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 30 January 2026 18:38:41 -0500 (0:00:00.286) 0:02:21.152 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 30 January 2026 18:38:41 -0500 (0:00:00.083) 0:02:21.236 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 30 January 2026 18:38:42 -0500 (0:00:00.528) 0:02:21.765 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 30 January 2026 18:38:42 -0500 (0:00:00.245) 0:02:22.010 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 30 January 2026 18:38:42 -0500 (0:00:00.235) 0:02:22.245 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 30 January 2026 18:38:43 -0500 (0:00:00.285) 0:02:22.531 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 30 January 2026 18:38:43 -0500 (0:00:00.328) 0:02:22.859 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 30 January 2026 18:38:43 -0500 (0:00:00.156) 0:02:23.016 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 30 January 2026 18:38:44 -0500 (0:00:00.377) 0:02:23.393 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 30 January 2026 18:38:44 -0500 (0:00:00.309) 0:02:23.703 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816291.073704, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816291.073704, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35654, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1769816291.073704, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:38:46 -0500 (0:00:01.605) 0:02:25.309 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
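Annotation: the device checks here follow a plain stat-then-assert pattern. A self-contained sketch of the same idea (the register name is hypothetical, not taken from the test source):

- name: Stat the raw device node
  stat:
    path: /dev/sda
  register: storage_test_dev_stat          # hypothetical register name

- name: Assert that the node exists and is a block device
  assert:
    that:
      - storage_test_dev_stat.stat.exists
      - storage_test_dev_stat.stat.isblk   # matches the isblk: true seen above
    msg: "Expected /dev/sda to exist as a block device node"

TASK [Verify the presence/absence of the device node - 2] ********************** task path: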
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:38:46 -0500 (0:00:00.305) 0:02:25.615 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:38:46 -0500 (0:00:00.212) 0:02:25.827 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:38:46 -0500 (0:00:00.213) 0:02:26.041 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:38:46 -0500 (0:00:00.200) 0:02:26.241 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:38:47 -0500 (0:00:00.205) 0:02:26.446 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:38:47 -0500 (0:00:00.217) 0:02:26.664 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816291.1917036, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816291.1917036, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 136739, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769816291.1917036, "nlink": 1, "path": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 30 January 2026 18:38:48 -0500 (0:00:00.825) 0:02:27.490 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 30 January 2026 18:38:51 -0500 (0:00:03.718) 0:02:31.208 ******** ok: [managed-node2] => { "changed": false, "cmd": 
[ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.012705", "end": "2026-01-30 18:38:53.241951", "rc": 0, "start": "2026-01-30 18:38:53.229246" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 63563baa-e47a-4110-ace5-a19589f17bad Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 905789 Threads: 2 Salt: 1f 33 9a ba 43 de 08 17 f8 38 57 35 2e b1 69 df b7 8c 12 0a 13 2c f2 a8 76 ff 33 ae fa 80 0e ae AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: a3 b6 a4 f5 9e 9d d3 3f 9a 0a ac 75 15 d9 14 70 09 55 4d b1 e8 82 36 66 a1 1b a5 ee 60 e6 47 9b Digest: df b3 2e 1c a8 a5 0a 79 24 d1 2b 39 9a b3 db c4 f6 51 6e 50 7a 9a 00 a1 ed eb 87 47 6a 9b a7 83 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 30 January 2026 18:38:53 -0500 (0:00:01.572) 0:02:32.781 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 30 January 2026 18:38:53 -0500 (0:00:00.275) 0:02:33.056 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 30 January 2026 18:38:53 -0500 (0:00:00.216) 0:02:33.272 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 30 January 2026 18:38:54 -0500 (0:00:00.158) 0:02:33.431 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 30 January 2026 18:38:54 -0500 (0:00:00.193) 0:02:33.624 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 30 January 2026 18:38:54 -0500 (0:00:00.362) 0:02:33.987 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:38:54 -0500 (0:00:00.172) 0:02:34.160 ******** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 30 January 2026 18:38:54 -0500 (0:00:00.122) 0:02:34.282 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-63563baa-e47a-4110-ace5-a19589f17bad /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 30 January 2026 18:38:55 -0500 (0:00:00.221) 0:02:34.503 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 30 January 2026 18:38:55 -0500 (0:00:00.166) 0:02:34.670 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 30 January 2026 18:38:55 -0500 (0:00:00.299) 0:02:34.970 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 30 January 2026 18:38:55 -0500 (0:00:00.190) 0:02:35.160 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 30 January 2026 18:38:56 -0500 (0:00:00.182) 0:02:35.343 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 30 January 2026 18:38:56 -0500 (0:00:00.146) 0:02:35.490 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 30 January 2026 18:38:56 -0500 (0:00:00.164) 0:02:35.654 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 30 January 2026 18:38:56 -0500 (0:00:00.190) 0:02:35.845 ******** 
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 30 January 2026 18:38:56 -0500 (0:00:00.177) 0:02:36.022 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 30 January 2026 18:38:56 -0500 (0:00:00.190) 0:02:36.213 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 30 January 2026 18:38:57 -0500 (0:00:00.118) 0:02:36.331 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 30 January 2026 18:38:57 -0500 (0:00:00.126) 0:02:36.457 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 30 January 2026 18:38:57 -0500 (0:00:00.162) 0:02:36.620 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 30 January 2026 18:38:57 -0500 (0:00:00.126) 0:02:36.746 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 30 January 2026 18:38:57 -0500 (0:00:00.141) 0:02:36.888 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:38:57 -0500 (0:00:00.228) 0:02:37.116 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:38:58 -0500 (0:00:00.238) 0:02:37.355 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:38:58 -0500 (0:00:00.239) 0:02:37.594 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:38:58 -0500 (0:00:00.214) 0:02:37.809 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:38:58 -0500 (0:00:00.295) 0:02:38.105 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:38:58 -0500 (0:00:00.164) 0:02:38.269 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:38:59 -0500 (0:00:00.268) 0:02:38.538 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:38:59 -0500 (0:00:00.180) 0:02:38.718 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:38:59 -0500 (0:00:00.252) 0:02:38.971 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:38:59 -0500 (0:00:00.151) 0:02:39.122 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:39:00 -0500 (0:00:00.192) 0:02:39.314 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:39:00 -0500 (0:00:00.182) 0:02:39.497 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] 
***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:39:00 -0500 (0:00:00.233) 0:02:39.731 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:39:00 -0500 (0:00:00.193) 0:02:39.924 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:39:00 -0500 (0:00:00.228) 0:02:40.153 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:39:01 -0500 (0:00:00.203) 0:02:40.356 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:39:01 -0500 (0:00:00.242) 0:02:40.599 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:39:01 -0500 (0:00:00.166) 0:02:40.765 ******** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:39:01 -0500 (0:00:00.223) 0:02:40.989 ******** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:39:01 -0500 (0:00:00.187) 0:02:41.177 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:39:02 -0500 (0:00:00.126) 0:02:41.303 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:39:02 -0500 (0:00:00.257) 0:02:41.561 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size 
based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:39:02 -0500 (0:00:00.244) 0:02:41.806 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:39:02 -0500 (0:00:00.253) 0:02:42.059 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:39:02 -0500 (0:00:00.133) 0:02:42.192 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:39:03 -0500 (0:00:00.161) 0:02:42.354 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:39:03 -0500 (0:00:00.251) 0:02:42.606 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:39:03 -0500 (0:00:00.177) 0:02:42.783 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:39:03 -0500 (0:00:00.194) 0:02:42.977 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:39:03 -0500 (0:00:00.221) 0:02:43.199 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:39:04 -0500 (0:00:00.231) 0:02:43.431 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: 
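
All of the size-verification steps above are skipped because this volume uses an absolute size rather than a percentage. For a percentage-sized thin volume, the test would derive the expected size from the pool size, first clamping the thin-pool reservation between a minimum and a maximum. A minimal sketch of that arithmetic, assuming placeholder values for the pool size, percentage, and reservation bounds (the role's actual defaults are not shown in this log):

    - name: Derive expected volume size from a percentage (sketch, assumed values)
      vars:
        pool_size_bytes: "{{ '10G' | human_to_bytes }}"     # assumed pool size
        reserve_percent: 20                                 # placeholder, not the role default
        min_reserve_bytes: "{{ '1G' | human_to_bytes }}"    # placeholder lower bound
        max_reserve_bytes: "{{ '100G' | human_to_bytes }}"  # placeholder upper bound
        volume_size_percent: 60                             # e.g. a volume sized "60%"
      set_fact:
        # reservation = clamp(percent-based reserve, min, max); usable = pool - reservation
        storage_test_max_thin_pool_size: >-
          {{ pool_size_bytes | int -
             ([[(pool_size_bytes | int) * reserve_percent // 100,
                min_reserve_bytes | int] | max,
               max_reserve_bytes | int] | min) }}
        # for a thin volume, the base would be the usable thin-pool size instead of the raw pool size
        storage_test_expected_size: >-
          {{ (pool_size_bytes | int) * volume_size_percent // 100 }}
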
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:39:04 -0500 (0:00:00.182) 0:02:43.613 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:39:04 -0500 (0:00:00.262) 0:02:43.876 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:39:04 -0500 (0:00:00.146) 0:02:44.023 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:39:04 -0500 (0:00:00.218) 0:02:44.241 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:39:05 -0500 (0:00:00.131) 0:02:44.373 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 30 January 2026 18:39:05 -0500 (0:00:00.097) 0:02:44.470 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:120 Friday 30 January 2026 18:39:07 -0500 (0:00:02.681) 0:02:47.152 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:39:08 -0500 (0:00:00.349) 0:02:47.501 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:39:08 -0500 (0:00:00.126) 0:02:47.628 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
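
The "Verify role raises correct error - 2" re-run that follows is the standard verify-role-failed pattern: invoke the role again with safe mode still enabled and assert that it fails with the expected message. A minimal sketch of that pattern, using the storage_*_global facts stored above (the block/rescue layout and the assertion text are assumptions inferred from the output below, not a copy of verify-role-failed.yml):

    - name: Verify the role fails in safe mode (sketch)
      block:
        - name: Re-run the role with safe mode left enabled
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: true          # destructive changes are refused
            storage_pools: "{{ storage_pools_global }}"
            storage_volumes: "{{ storage_volumes_global }}"
        - name: Fail if the role unexpectedly succeeded
          fail:
            msg: Role did not fail in safe mode
      rescue:
        - name: Check that the role failed with the expected error
          assert:
            that:
              - "'cannot remove existing formatting' in (ansible_failed_result.msg | d(''))"
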
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:39:08 -0500 (0:00:00.208) 0:02:47.836 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:39:08 -0500 (0:00:00.294) 0:02:48.131 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:39:09 -0500 (0:00:00.176) 0:02:48.308 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:39:09 -0500 (0:00:00.533) 0:02:48.842 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:39:09 -0500 (0:00:00.241) 0:02:49.083 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:39:10 -0500 (0:00:00.232) 0:02:49.316 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:39:10 -0500 (0:00:00.191) 0:02:49.507 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:39:10 -0500 (0:00:00.167) 0:02:49.675 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:39:10 -0500 (0:00:00.371) 0:02:50.046 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:39:15 -0500 (0:00:04.646) 0:02:54.693 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:39:15 -0500 (0:00:00.104) 0:02:54.798 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:39:15 -0500 (0:00:00.134) 0:02:54.933 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:39:20 -0500 (0:00:04.700) 0:02:59.634 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:39:20 -0500 (0:00:00.313) 0:02:59.947 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:39:20 -0500 (0:00:00.149) 0:03:00.096 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
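
The blivet_package_list fact loaded earlier in this run (echoed twice, once per import of the vars file) comes straight from the platform vars file; reconstructed from the values in this log, roles/storage/vars/CentOS_8.yml presumably contains:

    # roles/storage/vars/CentOS_8.yml, reconstructed from the values echoed in this log
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - vdo
      - kmod-kvdo
      - xfsprogs
      - stratisd
      - stratis-cli
      - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
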
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:39:20 -0500 (0:00:00.183) 0:03:00.280 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:39:21 -0500 (0:00:00.111) 0:03:00.392 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:39:25 -0500 (0:00:04.493) 0:03:04.885 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:39:28 -0500 (0:00:02.566) 0:03:07.451 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:39:28 -0500 (0:00:00.420) 0:03:07.872 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:39:28 -0500 (0:00:00.166) 0:03:08.038 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-63563baa-e47a-4110-ace5-a19589f17bad' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:39:34 -0500 (0:00:05.275) 0:03:13.314 ******** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-63563baa-e47a-4110-ace5-a19589f17bad' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:39:34 -0500 (0:00:00.192) 0:03:13.507 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 30 January 2026 18:39:34 -0500 (0:00:00.121) 0:03:13.629 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 30 January 2026 18:39:34 -0500 
(0:00:00.235) 0:03:13.864 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 30 January 2026 18:39:34 -0500 (0:00:00.186) 0:03:14.051 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 30 January 2026 18:39:34 -0500 (0:00:00.127) 0:03:14.178 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816347.6575549, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769816347.6575549, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1769816347.6575549, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3951338914", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 30 January 2026 18:39:36 -0500 (0:00:01.261) 0:03:15.440 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:141 Friday 30 January 2026 18:39:36 -0500 (0:00:00.186) 0:03:15.626 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:39:36 -0500 (0:00:00.375) 0:03:16.002 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:39:37 -0500 (0:00:00.315) 0:03:16.317 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:39:37 -0500 (0:00:00.251) 0:03:16.568 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", 
"skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:39:37 -0500 (0:00:00.611) 0:03:17.180 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:39:38 -0500 (0:00:00.277) 0:03:17.457 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:39:38 -0500 (0:00:00.190) 0:03:17.647 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:39:38 -0500 (0:00:00.200) 0:03:17.847 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:39:38 -0500 (0:00:00.226) 0:03:18.074 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:39:39 -0500 (0:00:00.408) 0:03:18.482 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:39:42 -0500 (0:00:03.521) 0:03:22.004 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:39:42 -0500 (0:00:00.143) 0:03:22.148 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:39:42 -0500 (0:00:00.078) 0:03:22.226 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:39:47 -0500 (0:00:04.983) 0:03:27.210 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:39:48 -0500 (0:00:00.276) 0:03:27.487 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:39:48 -0500 (0:00:00.143) 0:03:27.631 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:39:48 -0500 (0:00:00.164) 0:03:27.796 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:39:48 -0500 (0:00:00.133) 0:03:27.929 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:39:52 -0500 (0:00:04.224) 0:03:32.154 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": 
"irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": 
"nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": 
{ "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:39:55 -0500 (0:00:02.641) 0:03:34.796 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:39:55 -0500 (0:00:00.362) 0:03:35.159 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:39:56 -0500 (0:00:00.219) 0:03:35.378 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-63563baa-e47a-4110-ace5-a19589f17bad", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30
January 2026 18:40:01 -0500 (0:00:05.592) 0:03:40.971 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:40:01 -0500 (0:00:00.170) 0:03:41.141 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816302.6366735, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "64f6e1004abc37d6e2ff473fbac25cbfc821e100", "ctime": 1769816302.6326735, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769816302.6326735, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:40:03 -0500 (0:00:01.450) 0:03:42.592 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:40:04 -0500 (0:00:01.642) 0:03:44.234 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:40:05 -0500 (0:00:00.203) 0:03:44.438 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-63563baa-e47a-4110-ace5-a19589f17bad", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0,
"cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:40:05 -0500 (0:00:00.269) 0:03:44.707 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:40:05 -0500 (0:00:00.202) 0:03:44.909 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:40:05 -0500 (0:00:00.220) 0:03:45.130 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-63563baa-e47a-4110-ace5-a19589f17bad" } TASK 
[fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:40:07 -0500 (0:00:01.254) 0:03:46.385 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:40:08 -0500 (0:00:01.639) 0:03:48.024 ******** changed: [managed-node2] => (item={'src': 'UUID=8faa4d70-5959-4465-a208-91204d0638f8', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:40:10 -0500 (0:00:01.452) 0:03:49.476 ******** skipping: [managed-node2] => (item={'src': 'UUID=8faa4d70-5959-4465-a208-91204d0638f8', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:40:10 -0500 (0:00:00.124) 0:03:49.601 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:40:11 -0500 (0:00:01.461) 0:03:51.062 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816316.5596368, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b5abfe08e45f5401cb6d17925ebba1cef7412616", "ctime": 1769816306.7846625, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 100663493, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1769816306.7836626, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1544687531", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:40:13 -0500 (0:00:01.250) 0:03:52.313 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-63563baa-e47a-4110-ace5-a19589f17bad', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-63563baa-e47a-4110-ace5-a19589f17bad", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:40:14 -0500 (0:00:01.614) 0:03:53.928 ******** ok: [managed-node2] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:155 Friday 30 January 2026 18:40:16 -0500 (0:00:01.781) 0:03:55.709 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:40:16 -0500 (0:00:00.399) 0:03:56.109 ******** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:40:17 -0500 (0:00:00.290) 0:03:56.400 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:40:14 -0500 (0:00:01.614) 0:03:53.928 ******** ok: [managed-node2] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:155 Friday 30 January 2026 18:40:16 -0500 (0:00:01.781) 0:03:55.709 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:40:16 -0500 (0:00:00.399) 0:03:56.109 ******** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:40:17 -0500 (0:00:00.290) 0:03:56.400 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.]
***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:40:17 -0500 (0:00:00.146) 0:03:56.546 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "8faa4d70-5959-4465-a208-91204d0638f8" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:40:18 -0500 (0:00:01.315) 0:03:57.861 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002436", "end": "2026-01-30 18:40:19.641355", "rc": 0, "start": "2026-01-30 18:40:19.638919" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file.
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=8faa4d70-5959-4465-a208-91204d0638f8 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:40:19 -0500 (0:00:01.275) 0:03:59.137 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002551", "end": "2026-01-30 18:40:20.830996", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:40:20.828445" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:40:21 -0500 (0:00:01.161) 0:04:00.299 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:40:21 -0500 (0:00:00.149) 0:04:00.449 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:40:21 -0500 (0:00:00.330) 0:04:00.779 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:40:21 -0500 (0:00:00.189) 0:04:00.968 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:40:22 -0500 (0:00:01.059) 0:04:02.028 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:40:22 -0500 (0:00:00.250) 0:04:02.278 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:40:23 -0500 (0:00:00.278) 0:04:02.556 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:40:23 -0500 (0:00:00.725) 0:04:03.282 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 30 January 2026 18:40:24 -0500 (0:00:00.192) 0:04:03.475 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 30 January 2026 18:40:24 -0500 (0:00:00.242) 0:04:03.717 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 30 January 2026 18:40:24 -0500 (0:00:00.253) 0:04:03.971 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 30 January 2026 18:40:24 -0500 (0:00:00.247) 0:04:04.219 ******** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 30 January 2026 18:40:25 -0500 (0:00:00.333) 0:04:04.552 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 30 January 2026 18:40:25 -0500 (0:00:00.269) 0:04:04.821 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 30 January 2026 18:40:25 -0500 (0:00:00.286) 0:04:05.107 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 30 January 2026 18:40:26 -0500 (0:00:00.197) 0:04:05.305 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=8faa4d70-5959-4465-a208-91204d0638f8 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 30 January 2026 18:40:26 -0500 (0:00:00.531) 0:04:05.837 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 30 January 2026 18:40:26 -0500 (0:00:00.292) 0:04:06.129 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 30 January 2026 18:40:27 -0500 (0:00:00.321) 0:04:06.451 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 30 January 2026 18:40:27 -0500 (0:00:00.265) 0:04:06.716 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] 
TASK [Clean up variables]
****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 30 January 2026 18:40:27 -0500 (0:00:00.222) 0:04:07.046 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 30 January 2026 18:40:27 -0500 (0:00:00.222) 0:04:07.269 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 30 January 2026 18:40:28 -0500 (0:00:00.363) 0:04:07.633 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 30 January 2026 18:40:28 -0500 (0:00:00.354) 0:04:07.987 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816401.3964136, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816401.3964136, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35654, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1769816401.3964136, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:40:30 -0500 (0:00:01.631) 0:04:09.619 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:40:30 -0500 (0:00:00.326) 0:04:09.946 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:40:30 -0500 (0:00:00.258) 0:04:10.204 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path:
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:40:31 -0500 (0:00:00.304) 0:04:10.509 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:40:31 -0500 (0:00:00.309) 0:04:10.818 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:40:31 -0500 (0:00:00.282) 0:04:11.101 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:40:32 -0500 (0:00:00.244) 0:04:11.345 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 30 January 2026 18:40:32 -0500 (0:00:00.289) 0:04:11.634 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 30 January 2026 18:40:36 -0500 (0:00:03.655) 0:04:15.290 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 30 January 2026 18:40:36 -0500 (0:00:00.155) 0:04:15.446 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 30 January 2026 18:40:36 -0500 (0:00:00.175) 0:04:15.621 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 30 January 2026 18:40:36 -0500 (0:00:00.144) 0:04:15.765 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 30 January 2026 18:40:36 -0500 (0:00:00.102) 0:04:15.867 ******** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 30 January 2026 18:40:36 -0500 (0:00:00.170) 0:04:16.038 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 30 January 2026 18:40:36 -0500 (0:00:00.200) 0:04:16.238 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:40:37 -0500 (0:00:00.131) 0:04:16.370 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 30 January 2026 18:40:37 -0500 (0:00:00.213) 0:04:16.583 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 30 January 2026 18:40:37 -0500 (0:00:00.171) 0:04:16.754 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 30 January 2026 18:40:37 -0500 (0:00:00.263) 0:04:17.018 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 30 January 2026 18:40:37 -0500 (0:00:00.256) 0:04:17.275 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 30 January 2026 18:40:38 -0500 (0:00:00.305) 0:04:17.581 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 30 January 2026 18:40:38 -0500 (0:00:00.286) 0:04:17.867 ******** ok: [managed-node2] => { "ansible_facts": { 
"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 30 January 2026 18:40:38 -0500 (0:00:00.149) 0:04:18.017 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 30 January 2026 18:40:38 -0500 (0:00:00.261) 0:04:18.278 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 30 January 2026 18:40:39 -0500 (0:00:00.268) 0:04:18.546 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 30 January 2026 18:40:39 -0500 (0:00:00.197) 0:04:18.743 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 30 January 2026 18:40:39 -0500 (0:00:00.246) 0:04:18.990 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 30 January 2026 18:40:39 -0500 (0:00:00.189) 0:04:19.180 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 30 January 2026 18:40:40 -0500 (0:00:00.179) 0:04:19.359 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 30 January 2026 18:40:40 -0500 (0:00:00.270) 0:04:19.630 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 30 January 2026 18:40:40 -0500 (0:00:00.283) 0:04:19.913 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 30 January 2026 18:40:40 -0500 (0:00:00.314) 0:04:20.228 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:40:41 -0500 (0:00:00.166) 0:04:20.394 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:40:41 -0500 (0:00:00.188) 0:04:20.583 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:40:41 -0500 (0:00:00.151) 0:04:20.734 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:40:41 -0500 (0:00:00.253) 0:04:20.988 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:40:41 -0500 (0:00:00.286) 0:04:21.274 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:40:42 -0500 (0:00:00.208) 0:04:21.483 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:40:42 -0500 (0:00:00.214) 0:04:21.697 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:40:42 -0500 (0:00:00.153) 0:04:21.851 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:40:42 -0500 (0:00:00.244) 0:04:22.095 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:40:43 -0500 (0:00:00.215) 0:04:22.311 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:40:43 -0500 (0:00:00.197) 0:04:22.508 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:40:43 -0500 (0:00:00.304) 0:04:22.813 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:40:43 -0500 (0:00:00.181) 0:04:22.995 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:40:43 -0500 (0:00:00.215) 0:04:23.211 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:40:44 -0500 (0:00:00.135) 0:04:23.346 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:40:44 -0500 (0:00:00.139) 0:04:23.486 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:40:44 -0500 (0:00:00.143) 0:04:23.629 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:40:44 -0500 (0:00:00.250) 0:04:23.880 ******** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:40:44 -0500 (0:00:00.079) 0:04:23.959 ******** skipping: [managed-node2] => {} TASK 
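
NOTE: Every task in this stretch is skipped: the reserved-space and usable-space arithmetic only engages when the volume under test is an LVM thin volume, and here it is a plain disk. The guard presumably looks something like the sketch below; the log only shows the skips, so the variable name and the condition are assumptions:

    # Hypothetical guard; the actual "when:" in the test files may differ.
    - name: Calculate maximum usable space in thin pool
      set_fact:
        storage_test_max_thin_pool_size: "{{ storage_test_pool_size }}"  # placeholder math
      when: storage_test_volume.thin | d(false)
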
[Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:40:44 -0500 (0:00:00.219) 0:04:24.179 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:40:45 -0500 (0:00:00.211) 0:04:24.390 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:40:45 -0500 (0:00:00.294) 0:04:24.684 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:40:45 -0500 (0:00:00.194) 0:04:24.879 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:40:45 -0500 (0:00:00.250) 0:04:25.129 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:40:46 -0500 (0:00:00.262) 0:04:25.392 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:40:46 -0500 (0:00:00.222) 0:04:25.615 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:40:46 -0500 (0:00:00.252) 0:04:25.868 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:40:46 -0500 (0:00:00.182) 0:04:26.050 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
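
NOTE: The "VARIABLE IS NOT DEFINED!" lines above are not failures: that string is how the debug module renders an undefined variable, and the task still returns ok. Because the size checks were all skipped for this disk volume, storage_test_expected_size was never set. If a quieter log were wanted, a default would silence the banner (the 'not set' fallback is my addition, not the test's):

    - name: Show expected size
      debug:
        msg: "{{ storage_test_expected_size | d('not set') }}"
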
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:40:46 -0500 (0:00:00.178) 0:04:26.229 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:40:47 -0500 (0:00:00.173) 0:04:26.402 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:40:47 -0500 (0:00:00.185) 0:04:26.588 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:40:47 -0500 (0:00:00.251) 0:04:26.840 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:40:47 -0500 (0:00:00.191) 0:04:27.031 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:40:48 -0500 (0:00:00.305) 0:04:27.337 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:40:48 -0500 (0:00:00.233) 0:04:27.570 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:40:48 -0500 (0:00:00.216) 0:04:27.787 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 30 January 2026 18:40:48 -0500 (0:00:00.167) 0:04:27.954 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task path: 
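
NOTE: The "Create a file" task above (a changed result touching /opt/test1/quux) is the setup half of the data-preservation check that brackets each safe-mode run: touch a file on the mounted volume, run the role in a way that must refuse to destroy data, then confirm the file survived. Sketched as a pair below; the path comes from the log, while the task names and the tf register are mine:

    - name: Create a test file on the mounted volume
      file:
        path: /opt/test1/quux
        state: touch

    # ... the role runs here; with storage_safe_mode it must not reformat ...

    - name: Stat the test file after the role run
      stat:
        path: /opt/test1/quux
      register: tf

    - name: Assert the data was preserved
      assert:
        that:
          - tf.stat.exists
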
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:161 Friday 30 January 2026 18:40:50 -0500 (0:00:01.540) 0:04:29.494 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:40:50 -0500 (0:00:00.384) 0:04:29.878 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:40:50 -0500 (0:00:00.225) 0:04:30.103 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:40:51 -0500 (0:00:00.267) 0:04:30.371 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:40:51 -0500 (0:00:00.279) 0:04:30.651 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:40:51 -0500 (0:00:00.225) 0:04:30.876 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK 
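
NOTE: In the vars-file loop above, RedHat.yml and CentOS.yml are skipped and CentOS_8.yml is loaded twice; the duplicate is expected, most likely because two candidate patterns (distribution plus major version, and distribution plus full version) resolve to the same CentOS_8.yml filename on this host. The loaded blivet_package_list also keeps one entry templated, so s390x hosts get libblockdev-s390 instead of libblockdev. A compact way to express this kind of lookup, shown only as a sketch and not as the role's actual task (first_found stops at the first match rather than reporting every candidate the way the loop above does):

    # Sketch: load the most specific platform vars file that exists.
    - name: Set platform/version specific variables
      include_vars: "{{ lookup('first_found', params) }}"
      vars:
        params:
          files:
            - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
            - "{{ ansible_facts['distribution'] }}.yml"
            - "{{ ansible_facts['os_family'] }}.yml"
          paths:
            - vars
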
[fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:40:52 -0500 (0:00:00.550) 0:04:31.426 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:40:52 -0500 (0:00:00.253) 0:04:31.680 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:40:52 -0500 (0:00:00.196) 0:04:31.877 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:40:52 -0500 (0:00:00.123) 0:04:32.000 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:40:53 -0500 (0:00:00.319) 0:04:32.319 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:40:53 -0500 (0:00:00.412) 0:04:32.731 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:40:57 -0500 (0:00:04.002) 0:04:36.734 ******** ok: [managed-node2] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:40:57 -0500 (0:00:00.257) 0:04:36.992 ******** ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:40:57 -0500 (0:00:00.234) 0:04:37.227 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr 
repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:41:02 -0500 (0:00:04.776) 0:04:42.003 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:41:02 -0500 (0:00:00.273) 0:04:42.277 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:41:03 -0500 (0:00:00.169) 0:04:42.446 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:41:03 -0500 (0:00:00.278) 0:04:42.725 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:41:03 -0500 (0:00:00.183) 0:04:42.908 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:41:07 -0500 (0:00:04.044) 0:04:46.952 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { 
"name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service": { "name": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service": { "name": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:41:10 -0500 (0:00:02.729) 0:04:49.682 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service" ] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:41:10 -0500 (0:00:00.312) 0:04:49.995 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d63563baa\x2de47a\x2d4110\x2dace5\x2da19589f17bad.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "name": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-63563baa-e47a-4110-ace5-a19589f17bad", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-63563baa-e47a-4110-ace5-a19589f17bad /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-63563baa-e47a-4110-ace5-a19589f17bad ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:40:11 EST", "StateChangeTimestampMonotonic": "1817579840", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...de47a\x2d4110\x2dace5\x2da19589f17bad.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "name": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", 
"SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:41:14 -0500 (0:00:03.417) 0:04:53.412 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:41:19 -0500 (0:00:05.196) 0:04:58.609 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 
'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:41:19 -0500 (0:00:00.273) 0:04:58.883 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d63563baa\x2de47a\x2d4110\x2dace5\x2da19589f17bad.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "name": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": 
"systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d63563baa\\x2de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": 
"0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...de47a\x2d4110\x2dace5\x2da19589f17bad.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "name": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", 
"LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de47a\\x2d4110\\x2dace5\\x2da19589f17bad.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 30 January 2026 18:41:23 -0500 (0:00:00.186) 0:05:02.315 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 30 January 2026 18:41:23 -0500 (0:00:00.332) 0:05:02.502 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 30 January 2026 18:41:23 -0500 (0:00:00.219) 0:05:02.834 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Stat the file] ***********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 30 January 2026 18:41:23 -0500 (0:00:00.219) 0:05:03.054 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816449.9642856, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769816449.9642856, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1769816449.9642856, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2961731997", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 30 January 2026 18:41:25 -0500 (0:00:01.512) 0:05:04.566 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
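The file /opt/test1/quux written before the failed attempt is still present and untouched (size 0, and the checksum is the SHA-1 of an empty file), so the safe-mode refusal really did leave the volume's data alone. The pattern behind the two verify-data-preservation.yml tasks above is a stat followed by an assert; a minimal sketch (the register variable name here is illustrative, not necessarily the test's own):

    - name: Stat the file
      ansible.builtin.stat:
        path: /opt/test1/quux
      register: __storage_test_file    # hypothetical variable name

    - name: Assert file presence
      ansible.builtin.assert:
        that:
          - __storage_test_file.stat.exists

TASK [Add encryption to the volume] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:182 Friday 30 January 2026 18:41:25 -0500 (0:00:00.224) 0:05:04.790 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:41:26 -0500 (0:00:00.648) 0:05:05.439 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:41:26 -0500 (0:00:00.213) 0:05:05.652 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }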
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:41:26 -0500 (0:00:00.232) 0:05:05.885 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
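CentOS_8.yml matches twice in the platform/version vars lookup, so the same list is simply applied twice; that is harmless. Note the last entry of blivet_package_list: it is stored as an unrendered Jinja template and only resolves when the list is consumed, so s390x hosts pull libblockdev-s390 while everything else gets libblockdev. Reconstructed from the output above, the vars file boils down to this (a sketch; the actual file may contain comments or formatting not visible in the echoed result):

    # roles/storage/vars/CentOS_8.yml, as echoed in the task result above
    blivet_package_list:
      - python3-blivet
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - vdo
      - kmod-kvdo
      - xfsprogs
      - stratisd
      - stratis-cli
      # resolved per-architecture when the package list is consumed
      - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:41:27 -0500 (0:00:00.464) 0:05:06.349 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:41:27 -0500 (0:00:00.210) 0:05:06.560 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:41:27 -0500 (0:00:00.230) 0:05:06.790 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:41:27 -0500 (0:00:00.185) 0:05:06.976 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:41:27 -0500 (0:00:00.162) 0:05:07.138 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2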
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:41:28 -0500 (0:00:00.443) 0:05:07.582 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:41:32 -0500 (0:00:04.091) 0:05:11.673 ********
ok: [managed-node2] => { "storage_pools | d([])": [] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:41:32 -0500 (0:00:00.167) 0:05:11.840 ********
ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:41:32 -0500 (0:00:00.263) 0:05:12.104 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:41:38 -0500 (0:00:05.192) 0:05:17.296 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:41:38 -0500 (0:00:00.702) 0:05:17.999 ********
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:41:38 -0500 (0:00:00.138) 0:05:18.137 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:41:39 -0500 (0:00:00.173) 0:05:18.310 ********
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:41:39 -0500 (0:00:00.244) 0:05:18.555 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
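With cryptsetup and the blivet stack already in place, the role next snapshots systemd's view of the world. The Get service facts result that follows feeds the Set storage_cryptsetup_services task, which picks out any generator-produced systemd-cryptsetup@ units so they can be masked for the duration of the change (this time the list comes back empty). A sketch of how such a list can be derived from service facts; the filter chain is illustrative, not the role's actual implementation:

    - name: Get service facts
      ansible.builtin.service_facts:

    - name: Set storage_cryptsetup_services
      ansible.builtin.set_fact:
        # keep only unit names generated by systemd-cryptsetup-generator
        storage_cryptsetup_services: "{{ ansible_facts.services | dict2items | map(attribute='key') | select('match', '^systemd-cryptsetup@') | list }}"

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30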
January 2026 18:41:43 -0500 (0:00:03.983) 0:05:22.539 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": 
"initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": 
"rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": 
"systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": 
"user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:41:46 -0500 (0:00:02.784) 0:05:25.323 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:41:46 -0500 (0:00:00.329) 0:05:25.653 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:41:46 -0500 (0:00:00.122) 0:05:25.775 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": 
null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:42:00 -0500 (0:00:13.613) 0:05:39.388 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:42:00 -0500 (0:00:00.168) 0:05:39.557 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816410.0133908, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "203e472e463fa22106e8bd5b6155b54cc8021340", "ctime": 1769816410.0103908, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769816410.0103908, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:42:01 -0500 (0:00:01.220) 0:05:40.778 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:42:02 -0500 (0:00:01.281) 0:05:42.059 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:42:02 -0500 (0:00:00.116) 0:05:42.176 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", 
"/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:42:03 -0500 (0:00:00.167) 0:05:42.343 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:42:03 -0500 (0:00:00.227) 0:05:42.571 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } 
TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:42:03 -0500 (0:00:00.225) 0:05:42.796 ******** changed: [managed-node2] => (item={'src': 'UUID=8faa4d70-5959-4465-a208-91204d0638f8', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=8faa4d70-5959-4465-a208-91204d0638f8" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:42:05 -0500 (0:00:01.574) 0:05:44.371 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:42:06 -0500 (0:00:01.437) 0:05:45.809 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:42:07 -0500 (0:00:01.178) 0:05:46.987 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:42:07 -0500 (0:00:00.161) 0:05:47.148 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:42:09 
-0500 (0:00:01.279) 0:05:48.428 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816420.8293624, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769816414.3043795, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 220201156, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1769816414.3033795, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "114599586", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:42:10 -0500 (0:00:01.458) 0:05:49.887 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:42:11 -0500 (0:00:01.095) 0:05:50.982 ******** ok: [managed-node2] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:196 Friday 30 January 2026 18:42:13 -0500 (0:00:01.367) 0:05:52.349 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:42:13 -0500 (0:00:00.354) 0:05:52.703 ******** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:42:13 -0500 (0:00:00.058) 0:05:52.762 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:42:13 -0500 (0:00:00.157) 0:05:52.919 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "size": "10G", "type": "crypt", "uuid": "f8aa53a0-f2c1-450a-9003-7b21757ccb71" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "69ed5d4e-e0f6-4165-8685-9d059372f9b9" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:42:14 -0500 (0:00:01.236) 0:05:54.156 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002870", "end": "2026-01-30 18:42:15.840478", "rc": 0, "start": "2026-01-30 18:42:15.837608" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:42:15 -0500 (0:00:01.088) 0:05:55.245 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002666", "end": "2026-01-30 18:42:16.735535", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:42:16.732869" } STDOUT: luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:42:16 -0500 (0:00:00.947) 0:05:56.192 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:42:16 -0500 (0:00:00.079) 0:05:56.271 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:42:17 -0500 (0:00:00.216) 0:05:56.488 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:42:17 -0500 (0:00:00.215) 0:05:56.704 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for 
managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:42:18 -0500 (0:00:00.958) 0:05:57.663 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:42:18 -0500 (0:00:00.090) 0:05:57.753 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:42:18 -0500 (0:00:00.085) 0:05:57.838 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:42:18 -0500 (0:00:00.240) 0:05:58.078 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 30 January 2026 18:42:18 -0500 (0:00:00.161) 0:05:58.240 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 30 January 2026 18:42:19 -0500 (0:00:00.164) 0:05:58.404 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 30 January 2026 18:42:19 -0500 (0:00:00.254) 0:05:58.659 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 30 January 2026 18:42:19 -0500 (0:00:00.269) 0:05:58.928 
******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 30 January 2026 18:42:19 -0500 (0:00:00.177) 0:05:59.106 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 30 January 2026 18:42:20 -0500 (0:00:00.257) 0:05:59.364 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 30 January 2026 18:42:20 -0500 (0:00:00.262) 0:05:59.626 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 30 January 2026 18:42:20 -0500 (0:00:00.175) 0:05:59.802 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 30 January 2026 18:42:20 -0500 (0:00:00.287) 0:06:00.089 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 30 January 2026 18:42:21 -0500 (0:00:00.239) 0:06:00.328 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 30 January 2026 18:42:21 -0500 (0:00:00.242) 0:06:00.571 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 30 January 2026 18:42:21 -0500 (0:00:00.244) 0:06:00.815 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
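The three assertion results above follow mechanically from the facts set in "Set some variables for fstab checking": each storage_test_fstab_*_matches list holds exactly one entry (the mapper path, the mount point, and the options column were each found once in the /etc/fstab contents read earlier), and each expected count is "1". The checks therefore reduce to comparing a match-list length with its expected count; a minimal sketch in the same style (variable names taken from the log, the exact expressions in test-verify-volume-fstab.yml may differ):

    - name: Verify that the device identifier appears in /etc/fstab
      assert:
        that:
          - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int
        msg: "Volume foo was not found in /etc/fstab exactly once"

The "Verify fingerprint" result confirms the "# system_role:storage" marker that the role prepends to files it manages, visible as the first line of the fstab dump above.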
TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 30 January 2026 18:42:21 -0500 (0:00:00.227) 0:06:01.042 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 30 January 2026 18:42:21 -0500 (0:00:00.211) 0:06:01.254 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 30 January 2026 18:42:22 -0500 (0:00:00.356) 0:06:01.610 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 30 January 2026 18:42:22 -0500 (0:00:00.312) 0:06:01.923 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816519.7091055, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816519.7091055, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35654, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1769816519.7091055, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:42:23 -0500 (0:00:01.278) 0:06:03.202 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:42:24 -0500 (0:00:00.148) 0:06:03.350 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:42:24 -0500 (0:00:00.184) 0:06:03.535 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)]
*************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:42:24 -0500 (0:00:00.193) 0:06:03.729 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:42:24 -0500 (0:00:00.130) 0:06:03.859 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:42:24 -0500 (0:00:00.177) 0:06:04.037 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:42:24 -0500 (0:00:00.138) 0:06:04.175 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816519.847105, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816519.847105, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 163000, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769816519.847105, "nlink": 1, "path": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 30 January 2026 18:42:26 -0500 (0:00:01.189) 0:06:05.365 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 30 January 2026 18:42:29 -0500 (0:00:03.894) 0:06:09.259 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010448", "end": "2026-01-30 18:42:31.057077", "rc": 0, "start": "2026-01-30 18:42:31.046629" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 69ed5d4e-e0f6-4165-8685-9d059372f9b9 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 902397 Threads: 2 Salt: bd 36 c1 46 72 83 67 8d 54 95 c1 a6 43 e5 22 
b9 5c 98 d4 65 14 d6 98 d3 e2 d9 e9 84 28 c8 15 c0 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 119373 Salt: 3b 07 82 ec 0b e1 da ce 60 4c fe 40 b4 38 a6 a0 97 b3 4d bf c8 f1 7d 9f 31 68 29 c8 96 d2 86 9a Digest: b9 1d 08 66 e6 45 cc 72 b2 77 3c 2d 7c 67 4f 8f fd ff de 35 1d f3 44 1a 4a 60 f2 de 58 fe 9c 98 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 30 January 2026 18:42:31 -0500 (0:00:01.344) 0:06:10.604 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 30 January 2026 18:42:31 -0500 (0:00:00.260) 0:06:10.864 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 30 January 2026 18:42:31 -0500 (0:00:00.262) 0:06:11.127 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 30 January 2026 18:42:32 -0500 (0:00:00.174) 0:06:11.302 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 30 January 2026 18:42:32 -0500 (0:00:00.109) 0:06:11.411 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 30 January 2026 18:42:32 -0500 (0:00:00.359) 0:06:11.770 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:42:32 -0500 (0:00:00.244) 0:06:12.014 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 30 January 2026 18:42:32 -0500 (0:00:00.241) 0:06:12.256 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
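The luksDump output above is what the encryption checks are graded against: "Version: 2" matches the requested encryption_luks_version, the data segment uses aes-xts-plain64 with a 512-bit key (the default chosen here, since encryption_key_size was null; XTS splits that key into two 256-bit halves), the keyslot is protected by the argon2i PBKDF, and the 16777216-byte data offset is why the usable filesystem is slightly smaller than the raw 10G disk. On the crypttab side, an /etc/crypttab line has the form "<name> <backing device> <key file> [options]", so the "-" in the third field of "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9 /dev/sda -" means no key file: the passphrase must be supplied when the device is unlocked. The next few tasks assert exactly that. A sketch of this style of check (the task names mirror the log; the assert expression is illustrative, not the test's literal source):

    - name: Collect LUKS info for this volume
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false

    - name: Check LUKS version
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')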
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 30 January 2026 18:42:33 -0500 (0:00:00.267) 0:06:12.524 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 30 January 2026 18:42:33 -0500 (0:00:00.248) 0:06:12.772 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 30 January 2026 18:42:33 -0500 (0:00:00.255) 0:06:13.028 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 30 January 2026 18:42:34 -0500 (0:00:00.289) 0:06:13.317 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 30 January 2026 18:42:34 -0500 (0:00:00.250) 0:06:13.567 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 30 January 2026 18:42:34 -0500 (0:00:00.179) 0:06:13.747 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 30 January 2026 18:42:34 -0500 (0:00:00.111) 0:06:13.858 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 30 January 2026 18:42:34 -0500 (0:00:00.162) 0:06:14.020 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 30 January 2026 18:42:34 -0500 (0:00:00.194) 0:06:14.215 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path:
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 30 January 2026 18:42:35 -0500 (0:00:00.081) 0:06:14.296 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 30 January 2026 18:42:35 -0500 (0:00:00.226) 0:06:14.523 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 30 January 2026 18:42:35 -0500 (0:00:00.180) 0:06:14.704 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 30 January 2026 18:42:35 -0500 (0:00:00.081) 0:06:14.785 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 30 January 2026 18:42:35 -0500 (0:00:00.164) 0:06:14.949 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 30 January 2026 18:42:35 -0500 (0:00:00.211) 0:06:15.161 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:42:35 -0500 (0:00:00.109) 0:06:15.270 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:42:36 -0500 (0:00:00.161) 0:06:15.432 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:42:36 -0500 (0:00:00.210) 0:06:15.642 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:42:36 -0500 (0:00:00.278) 0:06:15.921 ******** ok: 
[managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:42:36 -0500 (0:00:00.256) 0:06:16.178 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:42:37 -0500 (0:00:00.294) 0:06:16.472 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:42:37 -0500 (0:00:00.175) 0:06:16.647 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:42:37 -0500 (0:00:00.183) 0:06:16.830 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:42:37 -0500 (0:00:00.140) 0:06:16.971 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:42:37 -0500 (0:00:00.163) 0:06:17.134 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:42:37 -0500 (0:00:00.140) 0:06:17.274 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:42:38 -0500 (0:00:00.109) 0:06:17.384 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:42:38 -0500 (0:00:00.175) 0:06:17.559 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:42:38 -0500 (0:00:00.162) 
0:06:17.721 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:42:38 -0500 (0:00:00.113) 0:06:17.835 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:42:38 -0500 (0:00:00.129) 0:06:17.964 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:42:38 -0500 (0:00:00.220) 0:06:18.185 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:42:39 -0500 (0:00:00.198) 0:06:18.383 ******** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:42:39 -0500 (0:00:00.166) 0:06:18.549 ******** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:42:39 -0500 (0:00:00.168) 0:06:18.718 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:42:39 -0500 (0:00:00.250) 0:06:18.969 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:42:39 -0500 (0:00:00.246) 0:06:19.216 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:42:40 -0500 (0:00:00.200) 0:06:19.416 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:42:40 -0500 
(0:00:00.251) 0:06:19.668 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:42:40 -0500 (0:00:00.252) 0:06:19.920 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:42:40 -0500 (0:00:00.117) 0:06:20.037 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:42:40 -0500 (0:00:00.173) 0:06:20.211 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:42:41 -0500 (0:00:00.235) 0:06:20.447 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:42:41 -0500 (0:00:00.241) 0:06:20.689 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:42:41 -0500 (0:00:00.182) 0:06:20.871 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:42:41 -0500 (0:00:00.187) 0:06:21.059 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:42:41 -0500 (0:00:00.181) 0:06:21.241 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:42:42 -0500 (0:00:00.131) 0:06:21.372 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:42:42 -0500 (0:00:00.216) 0:06:21.589 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:42:42 -0500 (0:00:00.172) 0:06:21.761 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:42:42 -0500 (0:00:00.148) 0:06:21.909 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:203 Friday 30 January 2026 18:42:42 -0500 (0:00:00.131) 0:06:22.040 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:42:43 -0500 (0:00:00.395) 0:06:22.435 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:42:43 -0500 (0:00:00.267) 0:06:22.703 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:42:43 -0500 (0:00:00.239) 0:06:22.943 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:42:43 -0500 (0:00:00.289) 0:06:23.232 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:42:44 -0500 (0:00:00.278) 0:06:23.511 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => 
(item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:42:44 -0500 (0:00:00.564) 0:06:24.075 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:42:45 -0500 (0:00:00.261) 0:06:24.337 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:42:45 -0500 (0:00:00.302) 0:06:24.639 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:42:45 -0500 (0:00:00.165) 0:06:24.804 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:42:45 -0500 (0:00:00.214) 0:06:25.019 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:42:46 -0500 (0:00:00.555) 0:06:25.574 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
[fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 30 January 2026 18:42:50 -0500 (0:00:04.122) 0:06:29.697 ********
ok: [managed-node2] => {
    "storage_pools | d([])": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "partition",
            "volumes": [
                {
                    "encryption": true,
                    "encryption_luks_version": "luks2",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g",
                    "type": "partition"
                }
            ]
        }
    ]
}
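For reference, the storage_pools value shown above corresponds to role input along the following lines (a minimal sketch reconstructed from this log, not the literal tests_luks2.yml source; the include_role wrapper is an assumption). Note that no encryption_password or encryption_key is supplied, which is exactly the condition this "new encrypted volume w/ no key" step exercises:

    - name: Create an encrypted partition volume with no key (expected to fail)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2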
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": 
"disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:43:04 -0500 (0:00:02.689) 0:06:43.691 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:43:04 -0500 (0:00:00.274) 0:06:43.965 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:43:04 -0500 (0:00:00.096) 0:06:44.062 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:43:09 -0500 (0:00:05.117) 0:06:49.179 ******** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:43:10 -0500 (0:00:00.186) 0:06:49.366 ******** TASK [Check that we failed in the role] **************************************** task path: 
TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Friday 30 January 2026 18:43:10 -0500 (0:00:00.183) 0:06:49.549 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Friday 30 January 2026 18:43:10 -0500 (0:00:00.156) 0:06:49.706 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Friday 30 January 2026 18:43:10 -0500 (0:00:00.423) 0:06:50.129 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Create an encrypted partition volume w/ default fs] **********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:223
Friday 30 January 2026 18:43:10 -0500 (0:00:00.139) 0:06:50.269 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:43:11 -0500 (0:00:00.502) 0:06:50.771 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:43:11 -0500 (0:00:00.239) 0:06:51.010 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:43:11 -0500 (0:00:00.168) 0:06:51.179 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] },
"ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:43:12 -0500 (0:00:00.429) 0:06:51.609 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:43:12 -0500 (0:00:00.258) 0:06:51.867 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:43:12 -0500 (0:00:00.200) 0:06:52.068 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:43:12 -0500 (0:00:00.197) 0:06:52.266 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:43:13 -0500 (0:00:00.181) 0:06:52.447 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:43:13 -0500 (0:00:00.563) 0:06:53.011 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:43:18 -0500 (0:00:04.397) 0:06:57.408 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:43:18 -0500 (0:00:00.278) 0:06:57.686 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 
Friday 30 January 2026 18:43:18 -0500 (0:00:00.267) 0:06:57.954 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:43:24 -0500 (0:00:05.382) 0:07:03.336 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:43:24 -0500 (0:00:00.451) 0:07:03.787 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:43:24 -0500 (0:00:00.131) 0:07:03.919 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:43:24 -0500 (0:00:00.201) 0:07:04.121 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:43:24 -0500 (0:00:00.147) 0:07:04.268 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:43:29 -0500 (0:00:04.373) 0:07:08.641 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": 
"systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:43:31 -0500 (0:00:02.534) 0:07:11.176 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:43:32 -0500 (0:00:00.281) 0:07:11.458 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified 
state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:43:32 -0500 (0:00:00.158) 0:07:11.617 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": 
"partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:43:46 -0500 (0:00:13.778) 0:07:25.395 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:43:46 -0500 (0:00:00.253) 0:07:25.648 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816527.4340858, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7a0fded8340d63f903a7551348e52168a5c420ee", "ctime": 1769816527.431086, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769816527.431086, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:43:47 -0500 (0:00:01.389) 0:07:27.038 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:43:48 -0500 (0:00:01.186) 0:07:28.225 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:43:49 -0500 (0:00:00.161) 0:07:28.386 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "password": "-", "state": "present" } ], 
"failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:43:49 -0500 (0:00:00.283) 0:07:28.670 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:43:49 -0500 (0:00:00.216) 0:07:28.887 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:43:49 -0500 (0:00:00.277) 0:07:29.164 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:43:51 -0500 (0:00:01.577) 0:07:30.742 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:43:53 -0500 (0:00:01.770) 0:07:32.512 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:43:54 -0500 (0:00:01.615) 0:07:34.128 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:43:55 -0500 (0:00:00.229) 0:07:34.358 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:43:56 -0500 (0:00:01.485) 0:07:35.843 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816536.7340627, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "257c0d5a77c0b699e1e33296b6152d0482e544b7", "ctime": 1769816531.5690756, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 348128194, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1769816531.5680757, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "3674267889", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:43:58 -0500 (0:00:01.454) 0:07:37.298 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda', 'name': 'luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:44:00 
-0500 (0:00:02.391) 0:07:39.690 ******** ok: [managed-node2] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:241 Friday 30 January 2026 18:44:02 -0500 (0:00:01.912) 0:07:41.602 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:44:02 -0500 (0:00:00.542) 0:07:42.145 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:44:03 -0500 (0:00:00.263) 0:07:42.409 ******** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:44:03 -0500 (0:00:00.173) 0:07:42.582 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "size": "4G", "type": "crypt", "uuid": "ec67f74a-458e-46f8-8809-c52d37fc73c5" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "e6b9be7c-8967-4afd-8c12-6d7654b37892" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:44:04 -0500 (0:00:01.613) 0:07:44.196 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002557", "end": "2026-01-30 18:44:06.183943", "rc": 0, "start": "2026-01-30 18:44:06.181386" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:44:06 -0500 (0:00:01.511) 0:07:45.707 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002503", "end": "2026-01-30 18:44:07.736562", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:44:07.734059" } STDOUT:
luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /dev/sda1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:44:07 -0500 (0:00:01.519) 0:07:47.226 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 30 January 2026 18:44:08 -0500 (0:00:00.294) 0:07:47.521 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 30 January 2026 18:44:08 -0500 (0:00:00.170) 0:07:47.692 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 30 January 2026 18:44:08 -0500 (0:00:00.224) 0:07:47.916 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 30 January 2026 18:44:08 -0500 (0:00:00.187) 0:07:48.104 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included:
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 30 January 2026 18:44:09 -0500 (0:00:00.835) 0:07:48.939 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 30 January 2026 18:44:09 -0500 (0:00:00.284) 0:07:49.223 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 30 January 2026 18:44:10 -0500 (0:00:00.203) 0:07:49.427 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 30 January 2026 18:44:10 -0500 (0:00:00.135) 0:07:49.562 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 30 January 2026 18:44:10 -0500 (0:00:00.141) 0:07:49.704 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 30 January 2026 18:44:10 -0500 (0:00:00.117) 0:07:49.821 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 30 January 2026 18:44:10 -0500 (0:00:00.140) 0:07:49.962 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 30 January 2026 18:44:10 -0500 (0:00:00.145) 0:07:50.107 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 30 January 2026 18:44:11 -0500 (0:00:00.296) 0:07:50.403 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] 
****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 30 January 2026 18:44:11 -0500 (0:00:00.172) 0:07:50.576 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.47.227 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 30 January 2026 18:44:12 -0500 (0:00:01.287) 0:07:51.863 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 30 January 2026 18:44:12 -0500 (0:00:00.129) 0:07:51.992 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 30 January 2026 18:44:13 -0500 (0:00:00.329) 0:07:52.321 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 30 January 2026 18:44:13 -0500 (0:00:00.154) 0:07:52.476 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 30 January 2026 18:44:13 -0500 (0:00:00.200) 0:07:52.677 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 30 January 2026 18:44:13 -0500 (0:00:00.219) 0:07:52.896 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 30 January 2026 18:44:13 -0500 (0:00:00.185) 0:07:53.081 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 30 January 2026 18:44:13 -0500 (0:00:00.200) 0:07:53.281 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 30 January 2026 18:44:14 -0500 (0:00:00.159) 0:07:53.441 ******** skipping: [managed-node2] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 30 January 2026 18:44:14 -0500 (0:00:00.190) 0:07:53.631 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 30 January 2026 18:44:14 -0500 (0:00:00.184) 0:07:53.816 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 30 January 2026 18:44:14 -0500 (0:00:00.184) 0:07:54.001 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 30 January 2026 18:44:14 -0500 (0:00:00.272) 0:07:54.273 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 30 January 2026 18:44:15 -0500 (0:00:00.215) 0:07:54.488 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 30 January 2026 18:44:15 -0500 (0:00:00.396) 0:07:54.885 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', '_kernel_device': 
'/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 30 January 2026 18:44:15 -0500 (0:00:00.276) 0:07:55.161 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 30 January 2026 18:44:16 -0500 (0:00:00.466) 0:07:55.628 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", 
"_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 30 January 2026 18:44:16 -0500 (0:00:00.223) 0:07:55.852 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 30 January 2026 18:44:17 -0500 (0:00:00.692) 0:07:56.545 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 30 January 2026 18:44:17 -0500 (0:00:00.250) 0:07:56.795 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 30 January 2026 18:44:17 -0500 (0:00:00.336) 0:07:57.131 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 30 January 2026 18:44:17 -0500 (0:00:00.142) 0:07:57.274 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 30 January 2026 18:44:18 -0500 (0:00:00.163) 0:07:57.437 ******** included: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 30 January 2026 18:44:18 -0500 (0:00:00.390) 0:07:57.828 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 30 January 2026 18:44:18 -0500 (0:00:00.324) 0:07:58.152 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 30 January 2026 18:44:19 -0500 (0:00:00.576) 0:07:58.729 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 30 January 2026 18:44:19 -0500 (0:00:00.386) 0:07:59.115 ******** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 30 January 2026 18:44:20 -0500 (0:00:00.214) 0:07:59.330 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 30 January 2026 18:44:20 -0500 (0:00:00.184) 0:07:59.515 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 30 January 2026 18:44:20 -0500 (0:00:00.229) 0:07:59.744 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 30 January 2026 18:44:20 -0500 (0:00:00.204) 0:07:59.949 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 30 January 2026 18:44:20 -0500 (0:00:00.178) 0:08:00.127 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 30 January 2026 18:44:21 -0500 (0:00:00.163) 0:08:00.290 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 30 January 2026 18:44:21 -0500 (0:00:00.180) 0:08:00.471 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:44:21 -0500 (0:00:00.300) 0:08:00.772 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:44:21 -0500 (0:00:00.227) 0:08:00.999 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:44:23 -0500 (0:00:01.387) 0:08:02.386 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:44:23 -0500 (0:00:00.531) 0:08:02.918 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:44:23 -0500 (0:00:00.282) 0:08:03.200 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:44:24 -0500 (0:00:00.327) 0:08:03.528 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Friday 30 January 2026 18:44:24 -0500 (0:00:00.258) 0:08:03.786 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Friday 30 January 2026 18:44:24 -0500 (0:00:00.238) 0:08:04.025 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Friday 30 January 2026 18:44:25 -0500 (0:00:00.286) 0:08:04.311 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Friday 30 January 2026 18:44:25 -0500 (0:00:00.322) 0:08:04.633 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Friday 30 January 2026 18:44:25 -0500 (0:00:00.192) 0:08:04.826 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Friday 30 January 2026 18:44:25 -0500 (0:00:00.219) 0:08:05.046 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Friday 30 January 2026 18:44:26 -0500 (0:00:00.257) 0:08:05.303 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 30 January 2026 18:44:26 -0500 (0:00:00.182) 0:08:05.486 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
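The facts above are all the fstab verification that follows actually uses: the test filters /etc/fstab for lines containing the mapper device path, the mount point, and the mount options, and expects exactly one match for each. A minimal standalone sketch of an equivalent check, using the device path from this run (the task names and the fstab_luks_count variable are illustrative, not part of the role):

    - name: Count fstab lines that reference the LUKS mapper device (illustrative)
      ansible.builtin.command:
        cmd: grep -c /dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /etc/fstab
      register: fstab_luks_count
      changed_when: false
      failed_when: false  # grep exits 1 when there are no matches

    - name: Expect exactly one matching /etc/fstab entry (illustrative)
      ansible.builtin.assert:
        that:
          - fstab_luks_count.stdout | int == 1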
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 30 January 2026 18:44:26 -0500 (0:00:00.549) 0:08:06.036 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 30 January 2026 18:44:27 -0500 (0:00:00.297) 0:08:06.333 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 30 January 2026 18:44:27 -0500 (0:00:00.276) 0:08:06.609 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 30 January 2026 18:44:27 -0500 (0:00:00.308) 0:08:06.917 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 30 January 2026 18:44:27 -0500 (0:00:00.287) 0:08:07.205 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 30 January 2026 18:44:28 -0500 (0:00:00.129) 0:08:07.335 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 30 January 2026 18:44:28 -0500 (0:00:00.307) 0:08:07.642 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 30 January 2026 18:44:28 -0500 (0:00:00.357) 0:08:08.000 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816625.6818388, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816625.6818388, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 175332, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1769816625.6818388, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true,
"rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:44:30 -0500 (0:00:01.430) 0:08:09.431 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:44:30 -0500 (0:00:00.227) 0:08:09.659 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:44:30 -0500 (0:00:00.112) 0:08:09.771 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:44:30 -0500 (0:00:00.198) 0:08:09.970 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:44:30 -0500 (0:00:00.238) 0:08:10.208 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:44:31 -0500 (0:00:00.274) 0:08:10.483 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:44:31 -0500 (0:00:00.230) 0:08:10.714 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816625.8328385, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816625.8328385, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 175530, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769816625.8328385, "nlink": 1, "path": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] 
********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 30 January 2026 18:44:32 -0500 (0:00:01.446) 0:08:12.161 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 30 January 2026 18:44:36 -0500 (0:00:03.904) 0:08:16.065 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010450", "end": "2026-01-30 18:44:37.841030", "rc": 0, "start": "2026-01-30 18:44:37.830580" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           e6b9be7c-8967-4afd-8c12-6d7654b37892
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     902397
        Threads:    2
        Salt:       ee 19 05 ab f9 cf 61 65 a3 18 18 80 7d 6c aa a8
                    f7 5e 63 68 a8 2b f4 7f fd 1d 77 fa 56 d0 49 cd
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120029
        Salt:       33 e6 4b 51 80 bf 7e 41 50 c5 8e 35 c4 0f 1f 46
                    55 87 07 a3 7e 7f c7 97 2c dc 53 06 00 33 59 75
        Digest:     30 dd 0c a5 de 9d 03 88 8b 67 1b 34 cc 25 98 dd
                    8b ad 6a ad 53 5c a0 b9 86 0a df 80 fb 2a bc 44
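The dump confirms what the following assertions check one by one: the header really is LUKS2 ("Version: 2"), with the default aes-xts-plain64 cipher, a 512-bit volume key, and argon2i as the PBKDF on the sole keyslot. The same inspection can be reproduced by hand; a minimal sketch assuming the device path from this run (the task names and the luks_header variable are illustrative):

    - name: Dump the LUKS header (illustrative)
      ansible.builtin.command:
        cmd: cryptsetup luksDump /dev/sda1
      register: luks_header
      changed_when: false  # read-only inspection

    - name: Confirm the header reports LUKS version 2 (illustrative)
      ansible.builtin.assert:
        that:
          - "luks_header.stdout is search('Version:\\s+2')"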
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 30 January 2026 18:44:38 -0500 (0:00:01.273) 0:08:17.338 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 30 January 2026 18:44:38 -0500 (0:00:00.272) 0:08:17.611 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 30 January 2026 18:44:38 -0500 (0:00:00.237) 0:08:17.848 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 30 January 2026 18:44:38 -0500 (0:00:00.128) 0:08:17.977 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 30 January 2026 18:44:38 -0500 (0:00:00.091) 0:08:18.069 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Friday 30 January 2026 18:44:39 -0500 (0:00:00.342) 0:08:18.412 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Friday 30 January 2026 18:44:39 -0500 (0:00:00.198) 0:08:18.610 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Friday 30 January 2026 18:44:39 -0500 (0:00:00.158) 0:08:18.769 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Friday 30 January 2026 18:44:39 -0500 (0:00:00.171) 0:08:18.940 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Friday 30 January 2026 18:44:39 -0500 (0:00:00.224) 0:08:19.165 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Friday 30 January 2026 18:44:40 -0500 (0:00:00.249) 0:08:19.414 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Friday 30 January 2026 18:44:40 -0500 (0:00:00.307) 0:08:19.721 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Friday 30 January 2026 18:44:40 -0500 (0:00:00.220) 0:08:19.942 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
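The crypttab assertions above hinge on the three-field /etc/crypttab entry format: mapper name, backing device, key file, where "-" in the key file field means no key file (the passphrase is supplied some other way, typically a prompt). Matching the exact line shown in the test variables is enough to reproduce the check; a minimal sketch using the entry from this run (task names and the crypttab variable are illustrative):

    - name: Read /etc/crypttab (illustrative)
      ansible.builtin.command:
        cmd: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Expect mapper name, backing device, and "-" key file on one line (illustrative)
      ansible.builtin.assert:
        that:
          - "'luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /dev/sda1 -' in crypttab.stdout"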
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 30 January 2026 18:44:40 -0500 (0:00:00.211) 0:08:20.154 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 30 January 2026 18:44:41 -0500 (0:00:00.207) 0:08:20.361 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 30 January 2026 18:44:41 -0500 (0:00:00.200) 0:08:20.562 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 30 January 2026 18:44:41 -0500 (0:00:00.245) 0:08:20.808 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 30 January 2026 18:44:41 -0500 (0:00:00.252) 0:08:21.060 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 30 January 2026 18:44:41 -0500 (0:00:00.226) 0:08:21.286 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 30 January 2026 18:44:42 -0500 (0:00:00.277) 0:08:21.563 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 30 January 2026 18:44:42 -0500 (0:00:00.340) 0:08:21.904 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 30 January 2026 18:44:42 -0500 (0:00:00.314) 0:08:22.219 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 30 January 2026 18:44:43 -0500 (0:00:00.235) 0:08:22.455 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path:
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:44:43 -0500 (0:00:00.292) 0:08:22.748 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:44:43 -0500 (0:00:00.251) 0:08:22.999 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:44:44 -0500 (0:00:00.349) 0:08:23.349 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:44:44 -0500 (0:00:00.219) 0:08:23.568 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:44:44 -0500 (0:00:00.294) 0:08:23.862 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:44:44 -0500 (0:00:00.287) 0:08:24.150 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:44:45 -0500 (0:00:00.370) 0:08:24.520 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:44:45 -0500 (0:00:00.203) 0:08:24.724 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:44:45 -0500 (0:00:00.165) 0:08:24.889 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:44:45 -0500 (0:00:00.264) 0:08:25.153 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] 
*************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:44:46 -0500 (0:00:00.266) 0:08:25.420 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:44:46 -0500 (0:00:00.238) 0:08:25.659 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:44:46 -0500 (0:00:00.234) 0:08:25.894 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:44:46 -0500 (0:00:00.178) 0:08:26.072 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:44:47 -0500 (0:00:00.240) 0:08:26.313 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:44:47 -0500 (0:00:00.317) 0:08:26.631 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:44:47 -0500 (0:00:00.214) 0:08:26.845 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:44:47 -0500 (0:00:00.215) 0:08:27.061 ******** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:44:48 -0500 (0:00:00.289) 0:08:27.350 ******** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:44:48 -0500 (0:00:00.271) 0:08:27.621 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and 
percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:44:48 -0500 (0:00:00.274) 0:08:27.895 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:44:48 -0500 (0:00:00.300) 0:08:28.196 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:44:49 -0500 (0:00:00.269) 0:08:28.465 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:44:49 -0500 (0:00:00.242) 0:08:28.708 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:44:49 -0500 (0:00:00.219) 0:08:28.927 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:44:49 -0500 (0:00:00.243) 0:08:29.171 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:44:50 -0500 (0:00:00.255) 0:08:29.426 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:44:50 -0500 (0:00:00.284) 0:08:29.711 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:44:50 -0500 (0:00:00.335) 0:08:30.046 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:44:50 -0500 (0:00:00.218) 0:08:30.265 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:44:51 -0500 (0:00:00.299) 0:08:30.564 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:44:51 -0500 (0:00:00.288) 0:08:30.853 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:44:51 -0500 (0:00:00.225) 0:08:31.078 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:44:52 -0500 (0:00:00.228) 0:08:31.307 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:44:52 -0500 (0:00:00.210) 0:08:31.518 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:44:52 -0500 (0:00:00.136) 0:08:31.654 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:44:52 -0500 (0:00:00.253) 0:08:31.908 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 30 January 2026 18:44:52 -0500 (0:00:00.164) 0:08:32.073 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:247 Friday 
30 January 2026 18:44:54 -0500 (0:00:01.540) 0:08:33.614 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:44:54 -0500 (0:00:00.660) 0:08:34.274 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:44:55 -0500 (0:00:00.265) 0:08:34.540 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:44:55 -0500 (0:00:00.291) 0:08:34.831 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:44:55 -0500 (0:00:00.289) 0:08:35.120 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:44:56 -0500 (0:00:00.311) 0:08:35.431 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:44:56 -0500 (0:00:00.544) 0:08:35.976 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:44:56 -0500 (0:00:00.157) 0:08:36.133 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:44:56 -0500 (0:00:00.121) 0:08:36.255 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:44:57 -0500 (0:00:00.058) 0:08:36.314 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 30 January 2026 18:44:57 -0500 (0:00:00.081) 0:08:36.395 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 30 January 2026 18:44:57 -0500 (0:00:00.432) 0:08:36.828 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 30 January 2026 18:45:01 -0500 (0:00:04.107) 0:08:40.935 ********
ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 30 January 2026 18:45:02 -0500 (0:00:00.361) 0:08:41.296 ********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 30 January 2026 18:45:02 -0500 (0:00:00.341) 0:08:41.638 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
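The storage_pools value shown a few tasks above is the entire user-facing input for this safe_mode scenario: the volume still exists encrypted on disk, but the requested state has encryption: false, which the role must refuse while storage_safe_mode is true (the global copy stored earlier in this block). A playbook reproducing the same invocation would look roughly like this, assembled from the values printed in this run rather than copied verbatim from tests_luks2.yml:

    - hosts: managed-node2
      vars:
        storage_safe_mode: true
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: false
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo
      roles:
        - fedora.linux_system_roles.storage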
*** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:45:07 -0500 (0:00:05.460) 0:08:47.099 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:45:08 -0500 (0:00:00.282) 0:08:47.382 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:45:08 -0500 (0:00:00.128) 0:08:47.511 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:45:08 -0500 (0:00:00.221) 0:08:47.733 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:45:08 -0500 (0:00:00.129) 0:08:47.862 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:45:12 -0500 (0:00:04.200) 0:08:52.063 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service": { "name": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service": { "name": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": 
"systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:45:15 -0500 (0:00:02.617) 0:08:54.681 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service" ] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:45:15 -0500 (0:00:00.385) 0:08:55.066 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d69ed5d4e\x2de0f6\x2d4165\x2d8685\x2d9d059372f9b9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "name": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target dev-sda.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-69ed5d4e-e0f6-4165-8685-9d059372f9b9 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": 
"none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:43:56 EST", "StateChangeTimestampMonotonic": "2042253265", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...de0f6\x2d4165\x2d8685\x2d9d059372f9b9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "name": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": 
"18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", 
"SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:45:19 -0500 (0:00:03.469) 0:08:58.536 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-e6b9be7c-8967-4afd-8c12-6d7654b37892' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:45:25 -0500 (0:00:05.848) 0:09:04.385 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-e6b9be7c-8967-4afd-8c12-6d7654b37892' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 
'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:45:25 -0500 (0:00:00.321) 0:09:04.706 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d69ed5d4e\x2de0f6\x2d4165\x2d8685\x2d9d059372f9b9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "name": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d69ed5d4e\\x2de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", 
"StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...de0f6\x2d4165\x2d8685\x2d9d059372f9b9.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "name": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", 
"IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...de0f6\\x2d4165\\x2d8685\\x2d9d059372f9b9.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", 
"SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 30 January 2026 18:45:28 -0500 (0:00:03.364) 0:09:08.071 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 30 January 2026 18:45:28 -0500 (0:00:00.183) 0:09:08.254 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 30 January 2026 18:45:29 -0500 (0:00:00.270) 0:09:08.525 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 30 January 2026 18:45:29 -0500 (0:00:00.156) 0:09:08.681 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816693.999667, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769816693.999667, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1769816693.999667, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1507453010", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 30 January 2026 18:45:30 -0500 (0:00:01.164) 0:09:09.846 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:272 Friday 30 January 2026 18:45:30 -0500 (0:00:00.242) 0:09:10.089 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:45:31 -0500 (0:00:00.950) 
TASK [Remove the encryption layer - 2] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:272
Friday 30 January 2026 18:45:30 -0500 (0:00:00.242) 0:09:10.089 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:45:31 -0500 (0:00:00.950) 0:09:11.039 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:45:32 -0500 (0:00:00.319) 0:09:11.358 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:45:32 -0500 (0:00:00.387) 0:09:11.746 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
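The loop above resolves the platform vars files; the item CentOS_8.yml is processed twice, evidently because both the distribution-major-version and distribution-version lookups resolve to the same filename on this host. The interesting part is the conditional last entry of blivet_package_list. Rewritten as the vars-file YAML it came from (a sketch reassembled from the ansible_facts shown above, not the file itself):

blivet_package_list:
  - python3-blivet
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - vdo
  - kmod-kvdo
  - xfsprogs
  - stratisd
  - stratis-cli
  - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"

The Jinja expression is evaluated per host, so s390x machines install libblockdev-s390 and everything else installs libblockdev.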
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:45:32 -0500 (0:00:00.514) 0:09:12.261 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:45:33 -0500 (0:00:00.290) 0:09:12.552 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:45:33 -0500 (0:00:00.254) 0:09:12.806 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:45:33 -0500 (0:00:00.243) 0:09:13.050 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 30 January 2026 18:45:34 -0500 (0:00:00.407) 0:09:13.457 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 30 January 2026 18:45:34 -0500 (0:00:00.527) 0:09:13.985 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 30 January 2026 18:45:39 -0500 (0:00:04.462) 0:09:18.447 ********
ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
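This storage_pools value is what makes this block remove the encryption layer: it is the same foo/test1 layout as before, but with encryption: false while the on-disk volume is still LUKS2, so the role must tear down the crypt layer and reformat without losing /opt/test1's contents. A sketch of how a playbook would pass it to the role (the include_role form is illustrative; the test may invoke the role differently):

- name: Remove the encryption layer
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: false
            encryption_luks_version: luks2
            encryption_password: yabbadabbadoo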
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 30 January 2026 18:45:39 -0500 (0:00:00.326) 0:09:18.774 ********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 30 January 2026 18:45:39 -0500 (0:00:00.270) 0:09:19.045 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 30 January 2026 18:45:45 -0500 (0:00:05.565) 0:09:24.610 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 30 January 2026 18:45:45 -0500 (0:00:00.384) 0:09:24.994 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 30 January 2026 18:45:45 -0500 (0:00:00.277) 0:09:25.272 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 30 January 2026 18:45:46 -0500 (0:00:00.311) 0:09:25.583 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Friday 30 January 2026 18:45:46 -0500 (0:00:00.165) 0:09:25.748 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Friday 30 January 2026 18:45:51 -0500 (0:00:04.546) 0:09:30.294 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" },
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service": { "name": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service": { "name": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:45:53 -0500 (0:00:02.920) 0:09:33.215 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:45:54 -0500 (0:00:00.433) 0:09:33.648 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2de6b9be7c\x2d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-sda1.device systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", 
"BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:45:18 EST", "StateChangeTimestampMonotonic": "2125016149", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => 
(item=systemd-cryptsetup@luk...d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", 
"LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:45:57 -0500 (0:00:03.461) 0:09:37.110 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": null }, { "action": 
"destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:46:03 -0500 (0:00:06.029) 0:09:43.139 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:46:03 -0500 (0:00:00.137) 0:09:43.276 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816634.4668167, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "67cb0c654c8220f2b0dfc07fd6d72f456f356631", "ctime": 1769816634.4638166, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, 
"isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769816634.4638166, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:46:05 -0500 (0:00:01.180) 0:09:44.457 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:46:06 -0500 (0:00:01.637) 0:09:46.095 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2de6b9be7c\x2d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", 
"IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", 
"StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:45:18 EST", "StateChangeTimestampMonotonic": "2125016149", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": 
"18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": 
"30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:46:10 -0500 (0:00:03.635) 0:09:49.731 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK 
[fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:46:10 -0500 (0:00:00.273) 0:09:50.004 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:46:10 -0500 (0:00:00.196) 0:09:50.200 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:46:11 -0500 (0:00:00.202) 0:09:50.403 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-e6b9be7c-8967-4afd-8c12-6d7654b37892" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:46:12 -0500 (0:00:01.528) 0:09:51.932 ******** ok: [managed-node2] => { "changed": false, "name": 
null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:46:14 -0500 (0:00:01.754) 0:09:53.686 ******** changed: [managed-node2] => (item={'src': 'UUID=213e64bb-74a7-4762-88f7-2ea4a358452a', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:46:15 -0500 (0:00:01.318) 0:09:55.005 ******** skipping: [managed-node2] => (item={'src': 'UUID=213e64bb-74a7-4762-88f7-2ea4a358452a', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:46:15 -0500 (0:00:00.201) 0:09:55.206 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:46:17 -0500 (0:00:01.684) 0:09:56.891 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816647.7357833, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "434979f22986ee5e94ae9ce72686e02e2e55c427", "ctime": 1769816640.1318026, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 463470725, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1769816640.1308024, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "1182510684", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:46:18 
-0500 (0:00:01.198) 0:09:58.090 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-e6b9be7c-8967-4afd-8c12-6d7654b37892', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:46:20 -0500 (0:00:01.228) 0:09:59.319 ******** ok: [managed-node2] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Friday 30 January 2026 18:46:21 -0500 (0:00:01.745) 0:10:01.064 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:46:22 -0500 (0:00:00.626) 0:10:01.691 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:46:22 -0500 (0:00:00.568) 0:10:02.260 ******** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
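
NOTE: The "Manage /etc/crypttab" task above reports "1 line(s) removed"; given the entry fields shown (name luks-e6b9be7c-8967-4afd-8c12-6d7654b37892, backing device /dev/sda1, password "-"), the removed line would have followed the standard name/device/password crypttab layout, roughly "luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /dev/sda1 -". The "Set up new/current mounts" task further above applies each computed mount entry; a roughly equivalent standalone task (a sketch built only from the fields in the log, not the role's internal implementation) would be:

    - name: Apply the re-created mount entry (sketch)
      mount:
        src: UUID=213e64bb-74a7-4762-88f7-2ea4a358452a   # fields taken from the mounts entry above
        path: /opt/test1
        fstype: xfs
        opts: defaults
        state: mounted
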
***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:46:23 -0500 (0:00:00.156) 0:10:02.416 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "213e64bb-74a7-4762-88f7-2ea4a358452a" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:46:24 -0500 (0:00:01.353) 0:10:03.770 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002675", "end": "2026-01-30 18:46:25.489164", "rc": 0, "start": "2026-01-30 18:46:25.486489" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=213e64bb-74a7-4762-88f7-2ea4a358452a /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:46:25 -0500 (0:00:01.290) 0:10:05.061 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.003699", "end": "2026-01-30 18:46:27.817585", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:46:26.813886" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:46:28 -0500 (0:00:02.441) 0:10:07.502 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 30 January 2026 18:46:28 -0500 (0:00:00.311) 0:10:07.814 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 30 January 2026 18:46:28 -0500 (0:00:00.106) 0:10:07.921 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 30 January 2026 18:46:28 -0500 (0:00:00.193) 0:10:08.114 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 30 January 2026 18:46:29 -0500 (0:00:00.260) 0:10:08.375 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included:
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 30 January 2026 18:46:29 -0500 (0:00:00.334) 0:10:08.709 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 30 January 2026 18:46:29 -0500 (0:00:00.240) 0:10:08.950 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 30 January 2026 18:46:29 -0500 (0:00:00.110) 0:10:09.060 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 30 January 2026 18:46:29 -0500 (0:00:00.094) 0:10:09.154 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 30 January 2026 18:46:29 -0500 (0:00:00.107) 0:10:09.262 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 30 January 2026 18:46:30 -0500 (0:00:00.189) 0:10:09.451 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 30 January 2026 18:46:30 -0500 (0:00:00.145) 0:10:09.597 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 30 January 2026 18:46:30 -0500 (0:00:00.107) 0:10:09.705 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 30 January 2026 18:46:30 -0500 (0:00:00.146) 0:10:09.852 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 30 January 2026 18:46:30 -0500 
(0:00:00.182) 0:10:10.034 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.47.227 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 30 January 2026 18:46:31 -0500 (0:00:00.836) 0:10:10.871 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 30 January 2026 18:46:31 -0500 (0:00:00.111) 0:10:10.983 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 30 January 2026 18:46:32 -0500 (0:00:00.353) 0:10:11.336 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 30 January 2026 18:46:32 -0500 (0:00:00.131) 0:10:11.467 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 30 January 2026 18:46:32 -0500 (0:00:00.089) 0:10:11.557 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 30 January 2026 18:46:32 -0500 (0:00:00.085) 0:10:11.643 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 30 January 2026 18:46:32 -0500 (0:00:00.042) 0:10:11.686 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 30 January 2026 18:46:32 -0500 (0:00:00.036) 0:10:11.722 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 30 January 2026 18:46:32 -0500 (0:00:00.092) 0:10:11.815 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
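
NOTE: The "Check that blivet supports PV grow to fill" result above shows rc 1 together with failed_when_result: false; the probe is deliberately allowed to fail, so a missing capability merely skips the dependent checks instead of failing the play. The general pattern looks like this (a generic sketch; the test's actual probe command is not shown in this log):

    - name: Probe for an optional capability without failing the play (sketch)
      command: /usr/libexec/platform-python -c "import blivet"  # hypothetical probe command
      register: capability_probe
      failed_when: false
      changed_when: false
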
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 30 January 2026 18:46:32 -0500 (0:00:00.143) 0:10:11.958 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 30 January 2026 18:46:32 -0500 (0:00:00.044) 0:10:12.003 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 30 January 2026 18:46:32 -0500 (0:00:00.083) 0:10:12.086 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 30 January 2026 18:46:32 -0500 (0:00:00.115) 0:10:12.201 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 30 January 2026 18:46:32 -0500 (0:00:00.059) 0:10:12.260 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 30 January 2026 18:46:33 -0500 (0:00:00.184) 0:10:12.444 ******** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=213e64bb-74a7-4762-88f7-2ea4a358452a', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": 
"/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 30 January 2026 18:46:33 -0500 (0:00:00.246) 0:10:12.691 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 30 January 2026 18:46:33 -0500 (0:00:00.300) 0:10:12.991 ******** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=213e64bb-74a7-4762-88f7-2ea4a358452a', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 30 January 2026 18:46:33 -0500 (0:00:00.265) 0:10:13.256 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 30 January 2026 18:46:34 -0500 (0:00:00.312) 0:10:13.568 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 30 January 2026 18:46:34 -0500 (0:00:00.112) 0:10:13.681 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 30 January 2026 18:46:34 -0500 (0:00:00.141) 0:10:13.823 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 30 January 2026 18:46:34 -0500 (0:00:00.139) 0:10:13.962 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 30 January 2026 18:46:34 -0500 (0:00:00.092) 0:10:14.054 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 30 January 2026 18:46:35 -0500 (0:00:00.419) 0:10:14.474 ******** skipping: [managed-node2] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=213e64bb-74a7-4762-88f7-2ea4a358452a', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 30 January 2026 18:46:35 -0500 (0:00:00.483) 0:10:14.957 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 30 January 2026 18:46:36 -0500 (0:00:00.679) 0:10:15.637 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 30 January 2026 18:46:36 -0500 (0:00:00.140) 0:10:15.777 ******** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 30 January 2026 18:46:36 
-0500 (0:00:00.197) 0:10:15.974 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 30 January 2026 18:46:36 -0500 (0:00:00.153) 0:10:16.128 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 30 January 2026 18:46:37 -0500 (0:00:00.274) 0:10:16.403 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 30 January 2026 18:46:37 -0500 (0:00:00.286) 0:10:16.689 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 30 January 2026 18:46:37 -0500 (0:00:00.147) 0:10:16.836 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 30 January 2026 18:46:37 -0500 (0:00:00.133) 0:10:16.970 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 30 January 2026 18:46:37 -0500 (0:00:00.107) 0:10:17.077 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:46:38 -0500 (0:00:00.381) 0:10:17.459 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:46:38 -0500 (0:00:00.240) 0:10:17.700 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for 
managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:46:39 -0500 (0:00:00.970) 0:10:18.671 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:46:39 -0500 (0:00:00.032) 0:10:18.703 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:46:39 -0500 (0:00:00.103) 0:10:18.807 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:46:39 -0500 (0:00:00.173) 0:10:18.980 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 30 January 2026 18:46:39 -0500 (0:00:00.209) 0:10:19.189 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 30 January 2026 18:46:40 -0500 (0:00:00.196) 0:10:19.386 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 30 January 2026 18:46:40 -0500 (0:00:00.204) 0:10:19.591 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
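
NOTE: The "Verify the current mount state by device" assertion above confirms that /dev/sda1 is mounted at /opt/test1. A hand-rolled equivalent using gathered facts (a sketch; the test's own assertion logic lives in test-verify-volume-mount.yml) could look like:

    - name: Assert the device is mounted at the expected path (sketch)
      assert:
        that:
          - "ansible_facts.mounts | selectattr('device', 'equalto', '/dev/sda1') | selectattr('mount', 'equalto', '/opt/test1') | list | length == 1"
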
****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 30 January 2026 18:46:40 -0500 (0:00:00.287) 0:10:19.879 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 30 January 2026 18:46:40 -0500 (0:00:00.241) 0:10:20.120 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 30 January 2026 18:46:41 -0500 (0:00:00.208) 0:10:20.329 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 30 January 2026 18:46:41 -0500 (0:00:00.195) 0:10:20.524 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 30 January 2026 18:46:41 -0500 (0:00:00.201) 0:10:20.725 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 30 January 2026 18:46:41 -0500 (0:00:00.528) 0:10:21.254 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 30 January 2026 18:46:42 -0500 (0:00:00.246) 0:10:21.501 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 30 January 2026 18:46:42 -0500 (0:00:00.265) 0:10:21.766 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 30 January 2026 18:46:42 -0500 (0:00:00.163) 0:10:21.930 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 30 January 2026 18:46:42 -0500 (0:00:00.258) 0:10:22.189 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 30 January 2026 18:46:43 -0500 (0:00:00.233) 0:10:22.423 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 30 January 2026 18:46:43 -0500 (0:00:00.257) 0:10:22.680 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 30 January 2026 18:46:43 -0500 (0:00:00.278) 0:10:22.959 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816763.5664926, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816763.5664926, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 175332, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1769816763.5664926, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:46:45 -0500 (0:00:01.407) 0:10:24.367 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:46:45 -0500 (0:00:00.291) 0:10:24.658 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:46:45 -0500 (0:00:00.134) 0:10:24.793 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:46:45 -0500 (0:00:00.223) 0:10:25.017 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:46:45 -0500 (0:00:00.242) 0:10:25.259 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:46:46 -0500 (0:00:00.281) 0:10:25.541 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:46:46 -0500 (0:00:00.273) 0:10:25.814 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 30 January 2026 18:46:46 -0500 (0:00:00.231) 0:10:26.046 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 30 January 2026 18:46:50 -0500 (0:00:03.855) 0:10:29.901 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 30 January 2026 18:46:50 -0500 (0:00:00.163) 0:10:30.065 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 30 January 2026 18:46:51 -0500 (0:00:00.238) 0:10:30.303 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 30 January 2026 18:46:51 -0500 (0:00:00.303) 0:10:30.607 ******** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 30 January 2026 18:46:51 -0500 (0:00:00.229) 0:10:30.836 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 30 January 2026 18:46:51 -0500 (0:00:00.307) 0:10:31.144 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 30 January 2026 18:46:52 -0500 (0:00:00.186) 0:10:31.330 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:46:52 -0500 (0:00:00.299) 0:10:31.629 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 30 January 2026 18:46:52 -0500 (0:00:00.260) 0:10:31.889 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 30 January 2026 18:46:52 -0500 (0:00:00.271) 0:10:32.161 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 30 January 2026 18:46:53 -0500 (0:00:00.286) 0:10:32.447 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 30 January 2026 18:46:53 -0500 (0:00:00.276) 0:10:32.724 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 30 January 2026 18:46:53 -0500 (0:00:00.194) 0:10:32.918 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 30 January 2026 18:46:53 -0500 (0:00:00.164) 0:10:33.082 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 30 January 2026 18:46:53 -0500 (0:00:00.151) 0:10:33.233 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 30 January 2026 18:46:54 -0500 (0:00:00.226) 0:10:33.460 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 30 January 2026 18:46:54 -0500 (0:00:00.144) 0:10:33.605 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 30 January 2026 18:46:54 -0500 (0:00:00.272) 0:10:33.877 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 30 January 2026 18:46:54 -0500 (0:00:00.270) 0:10:34.148 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 30 January 2026 18:46:55 -0500 (0:00:00.234) 0:10:34.383 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 30 January 2026 18:46:55 -0500 (0:00:00.171) 0:10:34.554 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 30 January 2026 18:46:55 -0500 (0:00:00.282) 0:10:34.836 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 30 January 2026 18:46:55 -0500 (0:00:00.197) 0:10:35.034 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 30 January 2026 18:46:55 -0500 (0:00:00.221) 0:10:35.256 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:46:56 -0500 (0:00:00.246) 0:10:35.502 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:46:56 -0500 (0:00:00.183) 0:10:35.685 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:46:56 -0500 (0:00:00.155) 0:10:35.841 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:46:56 -0500 (0:00:00.212) 0:10:36.054 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:46:56 -0500 (0:00:00.195) 0:10:36.250 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:46:57 -0500 (0:00:00.181) 0:10:36.431 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:46:57 -0500 (0:00:00.152) 0:10:36.584 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:46:57 -0500 (0:00:00.274) 0:10:36.859 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] 
***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:46:57 -0500 (0:00:00.138) 0:10:36.997 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:46:57 -0500 (0:00:00.110) 0:10:37.107 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:46:58 -0500 (0:00:00.318) 0:10:37.425 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:46:58 -0500 (0:00:00.157) 0:10:37.583 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:46:58 -0500 (0:00:00.299) 0:10:37.882 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:46:58 -0500 (0:00:00.316) 0:10:38.199 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:46:59 -0500 (0:00:00.287) 0:10:38.486 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:46:59 -0500 (0:00:00.136) 0:10:38.622 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:46:59 -0500 (0:00:00.246) 0:10:38.869 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:46:59 -0500 (0:00:00.284) 0:10:39.154 ******** skipping: [managed-node2] => {} TASK [Show test 
volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:47:00 -0500 (0:00:00.260) 0:10:39.414 ******** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:47:00 -0500 (0:00:00.262) 0:10:39.677 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:47:00 -0500 (0:00:00.241) 0:10:39.919 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:47:00 -0500 (0:00:00.285) 0:10:40.204 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:47:01 -0500 (0:00:00.295) 0:10:40.499 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:47:01 -0500 (0:00:00.274) 0:10:40.773 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:47:01 -0500 (0:00:00.281) 0:10:41.055 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:47:01 -0500 (0:00:00.146) 0:10:41.202 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:47:02 -0500 (0:00:00.191) 0:10:41.393 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:47:02 -0500 (0:00:00.186) 0:10:41.580 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:47:02 -0500 (0:00:00.292) 0:10:41.872 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:47:02 -0500 (0:00:00.190) 0:10:42.063 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:47:03 -0500 (0:00:00.319) 0:10:42.382 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:47:03 -0500 (0:00:00.146) 0:10:42.529 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:47:03 -0500 (0:00:00.246) 0:10:42.775 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:47:03 -0500 (0:00:00.179) 0:10:42.955 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:47:03 -0500 (0:00:00.147) 0:10:43.102 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:47:03 -0500 (0:00:00.105) 0:10:43.208 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:47:04 -0500 (0:00:00.253) 0:10:43.461 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": 
null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 30 January 2026 18:47:04 -0500 (0:00:00.135) 0:10:43.596 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:296 Friday 30 January 2026 18:47:05 -0500 (0:00:01.518) 0:10:45.114 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:47:06 -0500 (0:00:00.556) 0:10:45.670 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:47:06 -0500 (0:00:00.319) 0:10:45.989 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:47:06 -0500 (0:00:00.238) 0:10:46.228 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:47:07 -0500 (0:00:00.286) 0:10:46.514 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:47:07 -0500 (0:00:00.265) 0:10:46.780 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:47:08 -0500 (0:00:00.534) 0:10:47.314 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:47:08 -0500 (0:00:00.243) 0:10:47.558 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:47:08 -0500 (0:00:00.311) 0:10:47.869 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:47:08 -0500 (0:00:00.242) 0:10:48.111 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:47:09 -0500 (0:00:00.206) 0:10:48.317 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:47:10 -0500 (0:00:01.113) 0:10:49.430 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:47:14 -0500 (0:00:04.622) 0:10:54.053 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:47:15 -0500 (0:00:00.346) 0:10:54.399 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:47:15 -0500 (0:00:00.198) 0:10:54.598 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:47:21 -0500 (0:00:05.901) 0:11:00.500 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:47:21 -0500 (0:00:00.303) 0:11:00.803 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:47:21 -0500 (0:00:00.144) 0:11:00.948 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:47:21 -0500 (0:00:00.253) 0:11:01.201 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:47:22 -0500 (0:00:00.128) 0:11:01.329 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:47:26 -0500 (0:00:04.205) 0:11:05.535 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, 
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service": { "name": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service": { "name": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": 
"systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:47:29 -0500 (0:00:02.947) 0:11:08.482 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:47:29 -0500 (0:00:00.433) 0:11:08.916 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2de6b9be7c\x2d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-e6b9be7c-8967-4afd-8c12-6d7654b37892", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e6b9be7c-8967-4afd-8c12-6d7654b37892 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": 
"cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:45:18 EST", "StateChangeTimestampMonotonic": "2125016149", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:47:33 -0500 (0:00:03.469) 0:11:12.386 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:47:38 -0500 (0:00:05.791) 0:11:18.177 ******** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:47:39 -0500 (0:00:00.115) 0:11:18.293 ******** changed: [managed-node2] => 
(item=systemd-cryptsetup@luks\x2de6b9be7c\x2d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2de6b9be7c\\x2d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d8967\x2d4afd\x2d8c12\x2d6d7654b37892.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "name": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d8967\\x2d4afd\\x2d8c12\\x2d6d7654b37892.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 30 January 2026 18:47:41 -0500 (0:00:02.898) 0:11:21.192 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 30 January 2026 18:47:42 -0500 (0:00:00.241) 0:11:21.433 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 
Friday 30 January 2026 18:47:42 -0500 (0:00:00.248) 0:11:21.682 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Stat the file] ***********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Friday 30 January 2026 18:47:42 -0500 (0:00:00.169) 0:11:21.852 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816825.582338, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769816825.582338, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1769816825.582338, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1583476250", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Assert file presence] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Friday 30 January 2026 18:47:43 -0500 (0:00:01.149) 0:11:23.001 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Create a key file] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:323
Friday 30 January 2026 18:47:43 -0500 (0:00:00.202) 0:11:23.203 ********
ok: [managed-node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_test4gmic4o6lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Write the key into the key file] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:330
Friday 30 January 2026 18:47:45 -0500 (0:00:01.638) 0:11:24.841 ********
ok: [managed-node2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_test4gmic4o6lukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1769816865.8142536-183698-81888615841185/source", "state": "file", "uid": 0 }
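NOTE: the stat/assert pair above confirms that /opt/test1/quux survived the failed run, i.e. safe mode really did preserve the data. With the 32-byte key now written to /tmp/storage_test4gmic4o6lukskey (mode 0600), the test re-runs the role using key-file based LUKS2 encryption rather than the passphrase used earlier. The Show storage_pools task further down echoes the exact parameters; a condensed sketch of the invocation (include_role is one plausible way to call it):

    - name: Add encryption to the volume using a key file
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_key: /tmp/storage_test4gmic4o6lukskey   # path to the key file, not the key itself
                encryption_luks_version: luks2

TASK [Add encryption to the volume - 2] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:337
Friday 30 January 2026 18:47:49 -0500 (0:00:03.945) 0:11:28.787 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:47:49 -0500 (0:00:00.258) 0:11:29.046 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2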
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:47:50 -0500 (0:00:00.308) 0:11:29.354 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:47:50 -0500 (0:00:00.239) 0:11:29.593 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
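NOTE: the vars lookup above walks from the generic RedHat.yml down to the version-specific CentOS_8.yml (which matches twice on CentOS 8: once for the major version and once for the full version), so later files can override earlier ones; note also the Jinja2 conditional that swaps in libblockdev-s390 on s390x hosts. A rough illustration of the lookup pattern, not the role's literal task:

    # Hypothetical sketch; the real task also skips entries whose vars file
    # does not exist, which is why RedHat.yml and CentOS.yml show as skipped.
    - name: Set platform/version specific variables
      include_vars: "{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:47:50 -0500 (0:00:00.415) 0:11:30.009 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:47:50 -0500 (0:00:00.269) 0:11:30.279 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:47:51 -0500 (0:00:00.204) 0:11:30.483 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:47:51 -0500 (0:00:00.146) 0:11:30.630 ********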
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:47:51 -0500 (0:00:00.227) 0:11:30.858 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:47:52 -0500 (0:00:00.452) 0:11:31.311 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:47:56 -0500 (0:00:04.223) 0:11:35.534 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:47:56 -0500 (0:00:00.280) 0:11:35.814 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:47:56 -0500 (0:00:00.282) 0:11:36.097 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:48:02 -0500 (0:00:05.334) 0:11:41.432 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:48:02 -0500 (0:00:00.370) 0:11:41.802 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:48:02 -0500 (0:00:00.242) 0:11:42.044 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:48:03 -0500 (0:00:00.255) 0:11:42.300 ******** TASK [fedora.linux_system_roles.storage : Make 
sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:48:03 -0500 (0:00:00.199) 0:11:42.500 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:48:07 -0500 (0:00:04.080) 0:11:46.580 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", 
"state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": 
"import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { 
"name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:48:10 -0500 (0:00:02.952) 0:11:49.533 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:48:10 -0500 (0:00:00.288) 0:11:49.822 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:48:10 -0500 (0:00:00.122) 0:11:49.944 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "password": "/tmp/storage_test4gmic4o6lukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": 
null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:48:24 -0500 (0:00:13.575) 0:12:03.520 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:48:24 -0500 (0:00:00.116) 0:12:03.637 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816775.5024629, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b402d71ef3a7e9828d976f3ee663d01decbdbd74", "ctime": 1769816775.4994628, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769816775.4994628, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:48:25 -0500 (0:00:01.185) 0:12:04.822 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 
Friday 30 January 2026 18:48:27 -0500 (0:00:01.628) 0:12:06.451 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:48:27 -0500 (0:00:00.093) 0:12:06.544 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "password": "/tmp/storage_test4gmic4o6lukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:48:27 -0500 
(0:00:00.222) 0:12:06.767 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:48:27 -0500 (0:00:00.299) 0:12:07.066 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:48:28 -0500 (0:00:00.233) 0:12:07.300 ******** changed: [managed-node2] => (item={'src': 'UUID=213e64bb-74a7-4762-88f7-2ea4a358452a', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=213e64bb-74a7-4762-88f7-2ea4a358452a" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:48:29 -0500 (0:00:01.620) 0:12:08.920 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 
Friday 30 January 2026 18:48:31 -0500 (0:00:01.880) 0:12:10.800 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:48:32 -0500 (0:00:01.435) 0:12:12.236 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:48:33 -0500 (0:00:00.456) 0:12:12.693 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:48:35 -0500 (0:00:01.777) 0:12:14.470 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816786.8154347, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769816779.850452, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 92274889, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1769816779.8484519, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "360883342", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:48:36 -0500 (0:00:01.670) 0:12:16.141 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', 
'password': '/tmp/storage_test4gmic4o6lukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "password": "/tmp/storage_test4gmic4o6lukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:48:38 -0500 (0:00:01.499) 0:12:17.640 ******** ok: [managed-node2] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:355 Friday 30 January 2026 18:48:40 -0500 (0:00:01.745) 0:12:19.385 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:48:40 -0500 (0:00:00.256) 0:12:19.642 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:48:40 -0500 (0:00:00.298) 0:12:19.941 ******** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:48:40 -0500 (0:00:00.295) 0:12:20.236 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "size": "4G", "type": "crypt", "uuid": "e343dd83-4e9c-479c-8a94-b294ad15238f" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "ab99b751-7175-4a26-bc8f-cc52ceb754a6" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:48:42 -0500 (0:00:01.487) 0:12:21.724 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002385", "end": "2026-01-30 18:48:43.551884", "rc": 0, "start": "2026-01-30 18:48:43.549499" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:48:43 -0500 (0:00:01.458) 0:12:23.182 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002485", "end": "2026-01-30 18:48:45.172773", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:48:45.170288" } STDOUT: luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6 /dev/sda1 /tmp/storage_test4gmic4o6lukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:48:45 -0500 (0:00:01.604) 0:12:24.787 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 30 January 2026 18:48:45 -0500 (0:00:00.439) 0:12:25.227 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 30 January 2026 18:48:46 -0500 (0:00:00.209) 0:12:25.436 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 30 January 2026 18:48:46 -0500 (0:00:00.234) 0:12:25.671 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 30 January 2026 18:48:46 -0500 (0:00:00.276) 0:12:25.948 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 30 January 2026 18:48:47 -0500 (0:00:00.395) 0:12:26.344 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 30 January 2026 18:48:47 -0500 (0:00:00.132) 0:12:26.477 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 30 January 2026 18:48:47 -0500 (0:00:00.095) 0:12:26.572 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 30 January 2026 18:48:47 -0500 (0:00:00.212) 0:12:26.784 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 30 January 2026 18:48:47 -0500 (0:00:00.157) 0:12:26.941 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 30 January 2026 18:48:47 -0500 (0:00:00.299) 0:12:27.241 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 30 January 2026 18:48:48 -0500 (0:00:00.290) 0:12:27.531 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 30 January 2026 18:48:48 -0500 (0:00:00.272) 0:12:27.803 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 30 January 2026 18:48:48 -0500 (0:00:00.338) 0:12:28.141 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 30 January 2026 18:48:49 
-0500 (0:00:00.223) 0:12:28.365 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.47.227 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 30 January 2026 18:48:50 -0500 (0:00:01.597) 0:12:29.963 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 30 January 2026 18:48:50 -0500 (0:00:00.118) 0:12:30.081 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 30 January 2026 18:48:51 -0500 (0:00:00.250) 0:12:30.332 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 30 January 2026 18:48:51 -0500 (0:00:00.243) 0:12:30.576 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 30 January 2026 18:48:51 -0500 (0:00:00.113) 0:12:30.714 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 30 January 2026 18:48:51 -0500 (0:00:00.224) 0:12:30.938 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 30 January 2026 18:48:51 -0500 (0:00:00.252) 0:12:31.191 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 30 January 2026 18:48:52 -0500 (0:00:00.234) 0:12:31.426 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 30 January 2026 18:48:52 -0500 (0:00:00.250) 0:12:31.677 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 30 January 2026 18:48:52 -0500 (0:00:00.178) 0:12:31.856 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 30 January 2026 18:48:52 -0500 (0:00:00.331) 0:12:32.187 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 30 January 2026 18:48:53 -0500 (0:00:00.176) 0:12:32.364 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 30 January 2026 18:48:53 -0500 (0:00:00.239) 0:12:32.604 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 30 January 2026 18:48:53 -0500 (0:00:00.140) 0:12:32.745 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 30 January 2026 18:48:53 -0500 (0:00:00.395) 0:12:33.141 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test4gmic4o6lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was 
False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 30 January 2026 18:48:54 -0500 (0:00:00.314) 0:12:33.455 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 30 January 2026 18:48:54 -0500 (0:00:00.298) 0:12:33.754 ******** skipping: [managed-node2] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test4gmic4o6lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 30 January 2026 18:48:54 -0500 (0:00:00.141) 0:12:33.896 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 30 January 2026 18:48:54 -0500 (0:00:00.214) 0:12:34.110 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 30 January 2026 18:48:54 -0500 (0:00:00.138) 0:12:34.248 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 30 January 2026 18:48:55 -0500 (0:00:00.522) 0:12:34.771 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 30 January 2026 18:48:55 -0500 (0:00:00.082) 0:12:34.854 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 30 January 2026 18:48:55 -0500 (0:00:00.144) 0:12:34.998 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 30 January 2026 18:48:56 -0500 (0:00:00.384) 0:12:35.383 ******** skipping: [managed-node2] => 
(item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test4gmic4o6lukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test4gmic4o6lukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 30 January 2026 18:48:56 -0500 (0:00:00.175) 0:12:35.559 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 30 January 2026 18:48:56 -0500 (0:00:00.356) 0:12:35.915 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 30 January 2026 
18:48:57 -0500 (0:00:00.518) 0:12:36.433 ******** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 30 January 2026 18:48:57 -0500 (0:00:00.271) 0:12:36.704 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 30 January 2026 18:48:57 -0500 (0:00:00.388) 0:12:37.092 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 30 January 2026 18:48:58 -0500 (0:00:00.263) 0:12:37.356 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 30 January 2026 18:48:58 -0500 (0:00:00.243) 0:12:37.600 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 30 January 2026 18:48:58 -0500 (0:00:00.333) 0:12:37.933 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 30 January 2026 18:48:58 -0500 (0:00:00.254) 0:12:38.187 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 30 January 2026 18:48:59 -0500 (0:00:00.420) 0:12:38.414 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:48:59 -0500 (0:00:00.260) 0:12:38.834 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
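The _storage_volume_tests fact set just above drives the fan-out that follows: one test-verify-volume-<subset>.yml file is included per entry in the list. A minimal sketch of how such a fan-out is typically written, assuming the loop variable name implied by the task title (the include pattern is an illustration, not a quote of the test source):

    - name: Run test verify for storage_test_volume_subset
      ansible.builtin.include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset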
TASK [Run test verify for storage_test_volume_subset] ************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:48:59 -0500 (0:00:00.260) 0:12:39.095 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:49:00 -0500 (0:00:01.167) 0:12:40.262 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:49:01 -0500 (0:00:00.218) 0:12:40.480 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:49:01 -0500 (0:00:00.259) 0:12:40.740 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:49:01 -0500 (0:00:00.287) 0:12:41.027 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 30 January 2026 18:49:02 -0500 (0:00:00.347) 0:12:41.375 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 30 January 2026 18:49:02 -0500 (0:00:00.228) 0:12:41.604 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
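The "Verify the current mount state by device" assertion above passed: exactly one active mount of the decrypted mapper device exists at the expected mount point. A hedged sketch of such a check, assuming it is driven from ansible_facts['mounts'] (the exact assertion in the test source may differ):

    - name: Verify the current mount state by device (illustrative)
      ansible.builtin.assert:
        that:
          - ansible_facts['mounts'] | selectattr('device', 'equalto', storage_test_device_path)
            | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point)
            | list | length == 1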
TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 30 January 2026 18:49:02 -0500 (0:00:00.237) 0:12:41.841 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 30 January 2026 18:49:02 -0500 (0:00:00.171) 0:12:42.013 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 30 January 2026 18:49:02 -0500 (0:00:00.089) 0:12:42.103 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 30 January 2026 18:49:02 -0500 (0:00:00.161) 0:12:42.265 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 30 January 2026 18:49:03 -0500 (0:00:00.244) 0:12:42.509 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 30 January 2026 18:49:03 -0500 (0:00:00.165) 0:12:42.674 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 30 January 2026 18:49:03 -0500 (0:00:00.533) 0:12:43.208 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 30 January 2026 18:49:04 -0500 (0:00:00.307) 0:12:43.516 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
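The fstab facts above hold the /etc/fstab fragments that matched the device, mount point, and mount options, and the assertions then compare each match count against the expected "1". A rough sketch of how the id matches could be collected, assuming the fstab contents were registered earlier into storage_test_fstab (that variable is demonstrably used by the suite; the stdout attribute and the regex approach are assumptions):

    - name: Collect fstab lines that name the device (illustrative)
      ansible.builtin.set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout
          | regex_findall(storage_test_device_path ~ ' ') }}"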
TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 30 January 2026 18:49:04 -0500 (0:00:00.201) 0:12:43.718 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 30 January 2026 18:49:04 -0500 (0:00:00.195) 0:12:43.913 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 30 January 2026 18:49:04 -0500 (0:00:00.172) 0:12:44.086 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 30 January 2026 18:49:04 -0500 (0:00:00.185) 0:12:44.271 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 30 January 2026 18:49:05 -0500 (0:00:00.323) 0:12:44.595 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 30 January 2026 18:49:05 -0500 (0:00:00.332) 0:12:44.927 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816903.9001434, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816903.9001434, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 207430, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1769816903.9001434, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:49:07 -0500 (0:00:01.398) 0:12:46.326 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
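The stat result above feeds a presence assertion: a volume whose state is "present" must have an existing device node (here /dev/sda1). A minimal sketch of the stat-plus-assert pair, with an illustrative register name:

    - name: See whether the device node is present (illustrative)
      ansible.builtin.stat:
        path: "{{ storage_test_volume._raw_device }}"
      register: storage_test_dev
    - name: Verify the presence/absence of the device node (illustrative)
      ansible.builtin.assert:
        that:
          - storage_test_dev.stat.exists == _storage_test_volume_present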
TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:49:07 -0500 (0:00:00.232) 0:12:46.558 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:49:07 -0500 (0:00:00.257) 0:12:46.816 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:49:07 -0500 (0:00:00.222) 0:12:47.039 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:49:08 -0500 (0:00:00.324) 0:12:47.363 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:49:08 -0500 (0:00:00.182) 0:12:47.546 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:49:08 -0500 (0:00:00.244) 0:12:47.790 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816904.051143, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769816904.051143, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 207519, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769816904.051143, "nlink": 1, "path": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 30 January 2026 18:49:09 -0500 (0:00:01.494) 0:12:49.284 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do
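The next task captures the raw LUKS header so that later tasks can assert on its fields; the exact command line is visible in its result below. A sketch of the collection step (the register name is illustrative):

    - name: Collect LUKS info for this volume (illustrative)
      ansible.builtin.command:
        cmd: cryptsetup luksDump {{ storage_test_volume._raw_device }}
      register: luks_dump
      changed_when: false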
"cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010090", "end": "2026-01-30 18:49:14.919404", "rc": 0, "start": "2026-01-30 18:49:14.909314" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: ab99b751-7175-4a26-bc8f-cc52ceb754a6 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 906589 Threads: 2 Salt: 5b d8 c9 39 9c 8a 1e 71 11 b7 34 dc 59 1d 38 2f ff e1 09 15 76 f7 58 63 58 c2 36 85 a4 47 b7 34 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120029 Salt: 35 05 de cf e4 c7 53 9c 47 18 58 1b 77 b7 17 25 dd 52 ea d0 33 0c b7 45 21 cc 98 41 dd a4 21 b3 Digest: f9 6f 3a 17 6c 15 af c8 eb ce 14 f7 cf c6 76 26 9b b6 21 c2 e6 e1 f9 99 84 52 60 f9 6b 73 92 8f TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 30 January 2026 18:49:15 -0500 (0:00:01.178) 0:12:54.416 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 30 January 2026 18:49:15 -0500 (0:00:00.255) 0:12:54.671 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 30 January 2026 18:49:15 -0500 (0:00:00.203) 0:12:54.874 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 30 January 2026 18:49:15 -0500 (0:00:00.229) 0:12:55.104 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 30 January 2026 18:49:16 -0500 (0:00:00.193) 0:12:55.298 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 30 January 2026 18:49:16 -0500 (0:00:00.256) 0:12:55.555 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:49:16 -0500 (0:00:00.185) 0:12:55.740 ******** skipping: 
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:49:16 -0500 (0:00:00.185) 0:12:55.740 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 30 January 2026 18:49:16 -0500 (0:00:00.306) 0:12:56.047 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6 /dev/sda1 /tmp/storage_test4gmic4o6lukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_test4gmic4o6lukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 30 January 2026 18:49:17 -0500 (0:00:00.281) 0:12:56.329 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 30 January 2026 18:49:17 -0500 (0:00:00.206) 0:12:56.535 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 30 January 2026 18:49:17 -0500 (0:00:00.371) 0:12:56.907 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 30 January 2026 18:49:17 -0500 (0:00:00.102) 0:12:57.009 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 30 January 2026 18:49:17 -0500 (0:00:00.130) 0:12:57.140 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 30 January 2026 18:49:18 -0500 (0:00:00.165) 0:12:57.306 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 30 January 2026 18:49:18 -0500 (0:00:00.235) 0:12:57.541 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 30
January 2026 18:49:18 -0500 (0:00:00.135) 0:12:57.677 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 30 January 2026 18:49:18 -0500 (0:00:00.182) 0:12:57.860 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 30 January 2026 18:49:18 -0500 (0:00:00.151) 0:12:58.012 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 30 January 2026 18:49:18 -0500 (0:00:00.115) 0:12:58.128 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 30 January 2026 18:49:18 -0500 (0:00:00.132) 0:12:58.260 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 30 January 2026 18:49:19 -0500 (0:00:00.224) 0:12:58.485 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 30 January 2026 18:49:19 -0500 (0:00:00.160) 0:12:58.646 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 30 January 2026 18:49:19 -0500 (0:00:00.237) 0:12:58.883 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:49:19 -0500 (0:00:00.203) 0:12:59.087 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:49:19 -0500 (0:00:00.142) 0:12:59.229 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
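The size-parsing tasks in this block are all skipped in the present run; the conditionals appear to guard LVM- and thin-pool-specific size math that does not apply to this plain partition volume, whose requested size is simply "4g". When the parse steps do run, a human-readable size string has to become a byte count for comparison; a sketch using Ansible's built-in filter (the fact name is illustrative):

    - name: Parse the requested size of the volume (illustrative)
      ansible.builtin.set_fact:
        storage_test_requested_size: "{{ storage_test_volume.size | human_to_bytes }}"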
TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:49:20 -0500 (0:00:00.144) 0:12:59.373 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:49:20 -0500 (0:00:00.161) 0:12:59.534 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:49:20 -0500 (0:00:00.154) 0:12:59.689 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:49:20 -0500 (0:00:00.144) 0:12:59.834 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:49:20 -0500 (0:00:00.329) 0:13:00.163 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:49:21 -0500 (0:00:00.241) 0:13:00.404 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:49:21 -0500 (0:00:00.183) 0:13:00.588 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:49:21 -0500 (0:00:00.138) 0:13:00.727 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:49:21 -0500 (0:00:00.117) 0:13:00.845 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:49:21 -0500 (0:00:00.103) 0:13:00.948 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate
maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:49:21 -0500 (0:00:00.098) 0:13:01.046 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:49:21 -0500 (0:00:00.093) 0:13:01.140 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:49:21 -0500 (0:00:00.074) 0:13:01.214 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:49:22 -0500 (0:00:00.128) 0:13:01.343 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:49:22 -0500 (0:00:00.174) 0:13:01.517 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:49:22 -0500 (0:00:00.242) 0:13:01.760 ******** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:49:22 -0500 (0:00:00.229) 0:13:01.989 ******** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:49:22 -0500 (0:00:00.257) 0:13:02.247 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:49:23 -0500 (0:00:00.213) 0:13:02.461 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:49:23 -0500 (0:00:00.279) 0:13:02.740 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate 
the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:49:23 -0500 (0:00:00.366) 0:13:03.107 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:49:24 -0500 (0:00:00.351) 0:13:03.459 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:49:24 -0500 (0:00:00.223) 0:13:03.682 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:49:24 -0500 (0:00:00.185) 0:13:03.868 ******** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:49:25 -0500 (0:00:00.575) 0:13:04.444 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:49:25 -0500 (0:00:00.145) 0:13:04.589 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:49:25 -0500 (0:00:00.111) 0:13:04.700 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:49:25 -0500 (0:00:00.574) 0:13:05.275 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:49:26 -0500 (0:00:00.279) 0:13:05.554 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:49:26 -0500 (0:00:00.286) 0:13:05.841 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:49:26 -0500 (0:00:00.255) 0:13:06.097 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:49:27 -0500 (0:00:00.234) 0:13:06.332 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:49:27 -0500 (0:00:00.268) 0:13:06.600 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:49:27 -0500 (0:00:00.175) 0:13:06.775 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:49:27 -0500 (0:00:00.144) 0:13:06.920 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:358 Friday 30 January 2026 18:49:27 -0500 (0:00:00.210) 0:13:07.131 ******** ok: [managed-node2] => { "changed": false, "path": "/tmp/storage_test4gmic4o6lukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:368 Friday 30 January 2026 18:49:29 -0500 (0:00:01.634) 0:13:08.765 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:49:29 -0500 (0:00:00.253) 0:13:09.019 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
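verify-role-failed.yml snapshots the global test variables above, then invokes the role with an intentionally broken specification (here, a new encrypted volume with no key) and verifies that the role fails with the expected error instead of touching the disks. A hedged sketch of that pattern using block/rescue (the failure-matching details are illustrative, not quoted from the test source):

    - name: Verify role raises correct error (illustrative)
      block:
        - name: Run the role with the invalid spec
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
        - name: Reaching this task means the role did not fail as required
          ansible.builtin.fail:
            msg: Role was expected to fail but did not
      rescue:
        - name: Check that the failure is the expected one
          ansible.builtin.assert:
            that:
              - ansible_failed_result.msg is search(expected_error_regex)  # expected_error_regex is hypothetical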
TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:49:29 -0500 (0:00:00.193) 0:13:09.212 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:49:30 -0500 (0:00:00.209) 0:13:09.421 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:49:30 -0500 (0:00:00.384) 0:13:09.806 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:49:30 -0500 (0:00:00.395) 0:13:10.202 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:49:31 -0500 (0:00:00.506) 0:13:10.708 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:49:31 -0500 (0:00:00.292) 0:13:11.000 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:49:31 -0500 (0:00:00.204) 0:13:11.205
******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:49:32 -0500 (0:00:00.200) 0:13:11.406 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:49:32 -0500 (0:00:00.166) 0:13:11.572 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:49:32 -0500 (0:00:00.425) 0:13:11.997 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:49:37 -0500 (0:00:04.330) 0:13:16.327 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:49:37 -0500 (0:00:00.219) 0:13:16.547 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:49:37 -0500 (0:00:00.244) 0:13:16.791 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:49:42 -0500 (0:00:05.349) 0:13:22.140 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:49:43 -0500 (0:00:00.271) 0:13:22.411 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:49:43 -0500 (0:00:00.123) 0:13:22.535 ******** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:49:43 -0500 (0:00:00.151) 0:13:22.686 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:49:43 -0500 (0:00:00.088) 0:13:22.774 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:49:47 -0500 (0:00:04.027) 0:13:26.802 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": 
"grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": 
"syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:49:50 -0500 (0:00:02.763) 0:13:29.565 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:49:50 -0500 (0:00:00.256) 0:13:29.822 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:49:50 -0500 (0:00:00.155) 0:13:29.978 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:49:56 -0500 (0:00:05.783) 0:13:35.762 ******** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:49:56 -0500 (0:00:00.348) 0:13:36.110 ******** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 30 January 2026 18:49:57 -0500 (0:00:00.182) 0:13:36.292 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 30 January 2026 18:49:57 -0500 (0:00:00.273) 0:13:36.566 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 30 January 2026 18:49:57 -0500 (0:00:00.283) 0:13:36.849 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:387 Friday 30 January 2026 18:49:57 -0500 (0:00:00.316) 0:13:37.166 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:49:58 -0500 (0:00:00.365) 0:13:37.532 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:49:58 -0500 (0:00:00.276) 0:13:37.809 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:49:58 -0500 (0:00:00.210) 0:13:38.020 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, 
"ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:49:59 -0500 (0:00:00.529) 0:13:38.549 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:49:59 -0500 (0:00:00.246) 0:13:38.795 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:49:59 -0500 (0:00:00.249) 0:13:39.045 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:49:59 -0500 (0:00:00.191) 0:13:39.236 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:50:00 -0500 (0:00:00.219) 0:13:39.456 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:50:00 -0500 (0:00:00.429) 0:13:39.885 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:50:05 -0500 (0:00:04.976) 0:13:44.862 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:50:05 -0500 (0:00:00.365) 0:13:45.227 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:50:06 -0500 (0:00:00.342) 0:13:45.570 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:50:12 -0500 (0:00:05.811) 0:13:51.382 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:50:12 -0500 (0:00:00.324) 0:13:51.707 ********
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:50:12 -0500 (0:00:00.179) 0:13:51.886 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:50:12 -0500 (0:00:00.214) 0:13:52.101 ********
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:50:12 -0500 (0:00:00.112) 0:13:52.214 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:50:17 -0500 (0:00:04.161) 0:13:56.375 ******** ok: [managed-node2] => (service facts omitted: the ansible_facts.services dump here is identical to the one shown for the earlier "Get service facts" task in this section)
TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:50:19 -0500 (0:00:02.793) 0:13:59.169 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:50:20
-0500 (0:00:00.441) 0:13:59.610 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:50:20 -0500 (0:00:00.252) 0:13:59.862 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, 
"mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:50:34 -0500 (0:00:14.419) 0:14:14.282 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:50:35 -0500 (0:00:00.334) 0:14:14.616 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816912.6881216, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "71a12d5a218091af791ee526d4c544b6e4f333e8", "ctime": 1769816912.6851215, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769816912.6851215, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:50:37 -0500 (0:00:01.679) 0:14:16.296 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:50:38 -0500 (0:00:01.927) 0:14:18.223 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:50:39 -0500 (0:00:00.230) 0:14:18.454 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", 
"device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:50:39 -0500 (0:00:00.234) 0:14:18.688 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:50:39 -0500 (0:00:00.346) 0:14:19.035 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:50:40 -0500 (0:00:00.256) 0:14:19.291 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:50:41 -0500 (0:00:01.619) 0:14:20.911 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:50:43 -0500 (0:00:02.011) 0:14:22.922 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 
'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:50:45 -0500 (0:00:01.532) 0:14:24.455 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:50:45 -0500 (0:00:00.418) 0:14:24.873 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:50:47 -0500 (0:00:02.032) 0:14:26.906 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769816925.1710904, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e2f0db186f379a7cd2f4ac202d8494d032573709", "ctime": 1769816918.165108, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 232784132, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1769816918.1641078, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "506883123", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:50:49 -0500 (0:00:01.669) 0:14:28.576 ******** changed: [managed-node2] => (item={'backing_device': '/dev/sda1', 'name': 'luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "password": "-", 
"state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:50:52 -0500 (0:00:03.050) 0:14:31.626 ******** ok: [managed-node2] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:406 Friday 30 January 2026 18:50:54 -0500 (0:00:01.865) 0:14:33.492 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:50:54 -0500 (0:00:00.246) 0:14:33.739 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:50:54 -0500 (0:00:00.235) 0:14:33.975 ******** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:50:54 -0500 (0:00:00.193) 0:14:34.168 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "c3cb5b2a-dd54-432f-b831-c342bb741a6d" }, "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "size": "4G", "type": "crypt", "uuid": "29b7e385-54d2-40be-8fa1-5114cc36992c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:50:56 -0500 (0:00:01.556) 0:14:35.724 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002657", "end": "2026-01-30 18:50:57.287697", "rc": 0, "start": "2026-01-30 18:50:57.285040" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:50:57 -0500 (0:00:01.062) 0:14:36.787 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002916", "end": "2026-01-30 18:50:58.193062", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:50:58.190146" } STDOUT: luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:50:58 -0500 (0:00:00.821) 0:14:37.608 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 30 January 2026 18:50:58 -0500 (0:00:00.270) 0:14:37.879 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 30 January 2026 18:50:58 -0500 (0:00:00.130) 0:14:38.010 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024471", "end": "2026-01-30 18:50:59.634351", "rc": 0, "start": "2026-01-30 18:50:59.609880" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 30 January 2026 18:50:59 -0500 (0:00:01.104) 0:14:39.114 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 30 January 2026 18:51:00 -0500 (0:00:00.218) 0:14:39.332 ******** included: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 30 January 2026 18:51:00 -0500 (0:00:00.240) 0:14:39.572 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 30 January 2026 18:51:00 -0500 (0:00:00.209) 0:14:39.782 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 30 January 2026 18:51:03 -0500 (0:00:02.588) 0:14:42.371 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 30 January 2026 18:51:03 -0500 (0:00:00.322) 0:14:42.693 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 30 January 2026 18:51:03 -0500 (0:00:00.181) 0:14:42.875 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 30 January 2026 18:51:03 -0500 (0:00:00.320) 0:14:43.195 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 30 January 2026 18:51:04 -0500 (0:00:00.302) 0:14:43.497 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 30 January 2026 18:51:04 -0500 (0:00:00.227) 0:14:43.725 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 
Friday 30 January 2026 18:51:04 -0500 (0:00:00.303) 0:14:44.029 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 30 January 2026 18:51:05 -0500 (0:00:00.389) 0:14:44.418 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.47.227 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 30 January 2026 18:51:06 -0500 (0:00:01.489) 0:14:45.907 ******** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 30 January 2026 18:51:06 -0500 (0:00:00.223) 0:14:46.131 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 30 January 2026 18:51:07 -0500 (0:00:00.455) 0:14:46.586 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 30 January 2026 18:51:07 -0500 (0:00:00.149) 0:14:46.735 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 30 January 2026 18:51:07 -0500 (0:00:00.148) 0:14:46.884 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 30 January 2026 18:51:07 -0500 (0:00:00.321) 0:14:47.205 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 30 January 2026 18:51:08 -0500 (0:00:00.634) 0:14:47.840 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 30 January 2026 18:51:08 
-0500 (0:00:00.236) 0:14:48.076 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 30 January 2026 18:51:09 -0500 (0:00:00.246) 0:14:48.323 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 30 January 2026 18:51:09 -0500 (0:00:00.222) 0:14:48.546 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 30 January 2026 18:51:09 -0500 (0:00:00.142) 0:14:48.688 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 30 January 2026 18:51:09 -0500 (0:00:00.189) 0:14:48.878 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 30 January 2026 18:51:09 -0500 (0:00:00.278) 0:14:49.156 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 30 January 2026 18:51:10 -0500 (0:00:00.253) 0:14:49.409 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 30 January 2026 18:51:10 -0500 (0:00:00.476) 0:14:49.886 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 30 January 2026 18:51:11 -0500 (0:00:00.516) 0:14:50.403 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 30 January 2026 18:51:11 
-0500 (0:00:00.286) 0:14:50.689 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 30 January 2026 18:51:11 -0500 (0:00:00.277) 0:14:50.967 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 30 January 2026 18:51:12 -0500 (0:00:00.333) 0:14:51.300 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 30 January 2026 18:51:12 -0500 (0:00:00.181) 0:14:51.482 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 30 January 2026 18:51:12 -0500 (0:00:00.313) 0:14:51.796 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 30 January 2026 18:51:12 -0500 (0:00:00.410) 0:14:52.206 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 30 January 2026 18:51:13 -0500 (0:00:00.203) 0:14:52.410 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 30 January 2026 18:51:13 -0500 (0:00:00.504) 0:14:52.914 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 30 January 2026 18:51:14 -0500 (0:00:00.458) 0:14:53.372 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 30 January 2026 18:51:14 -0500 (0:00:00.321) 0:14:53.693 ******** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 30 January 2026 18:51:14 -0500 (0:00:00.277) 0:14:53.971 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 30 January 2026 18:51:14 -0500 (0:00:00.243) 0:14:54.214 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 30 January 2026 18:51:15 -0500 (0:00:00.220) 0:14:54.434 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 30 January 2026 18:51:15 -0500 (0:00:00.428) 0:14:54.863 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 30 January 2026 18:51:15 -0500 (0:00:00.305) 0:14:55.168 ******** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 30 January 2026 18:51:16 -0500 (0:00:00.340) 0:14:55.509 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 30 January 2026 18:51:16 -0500 (0:00:00.322) 0:14:55.832 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 30 January 2026 18:51:16 -0500 (0:00:00.211) 0:14:56.043 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 30 January 2026 
18:51:16 -0500 (0:00:00.181) 0:14:56.225 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 30 January 2026 18:51:17 -0500 (0:00:00.176) 0:14:56.401 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 30 January 2026 18:51:17 -0500 (0:00:00.377) 0:14:56.778 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 30 January 2026 18:51:17 -0500 (0:00:00.237) 0:14:57.016 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 30 January 2026 18:51:18 -0500 (0:00:00.327) 0:14:57.344 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 30 January 2026 18:51:18 -0500 (0:00:00.237) 0:14:57.581 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 30 January 2026 18:51:18 -0500 (0:00:00.596) 0:14:58.178 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 30 January 2026 18:51:19 -0500 (0:00:00.467) 0:14:58.645 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 30 January 2026 18:51:19 -0500 (0:00:00.355) 0:14:59.001 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 30 January 2026 18:51:19 -0500 (0:00:00.287) 
0:14:59.288 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 30 January 2026 18:51:20 -0500 (0:00:00.303) 0:14:59.592 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 30 January 2026 18:51:20 -0500 (0:00:00.178) 0:14:59.770 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 30 January 2026 18:51:20 -0500 (0:00:00.214) 0:14:59.984 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 30 January 2026 18:51:20 -0500 (0:00:00.244) 0:15:00.228 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 30 January 2026 18:51:21 -0500 (0:00:00.229) 0:15:00.458 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 30 January 2026 18:51:21 -0500 (0:00:00.459) 0:15:00.918 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 30 January 2026 18:51:21 -0500 (0:00:00.219) 0:15:01.137 ******** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 30 January 2026 18:51:22 -0500 (0:00:00.211) 0:15:01.348 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 30 January 2026 18:51:22 -0500 (0:00:00.147) 0:15:01.496 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 30 January 2026 18:51:22 -0500 (0:00:00.255) 0:15:01.751 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 30 January 2026 18:51:22 -0500 (0:00:00.145) 0:15:01.897 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 30 January 2026 18:51:22 -0500 (0:00:00.179) 0:15:02.077 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 30 January 2026 18:51:22 -0500 (0:00:00.173) 0:15:02.250 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 30 January 2026 18:51:23 -0500 (0:00:00.195) 0:15:02.446 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:51:23 -0500 (0:00:00.321) 0:15:02.768 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:51:23 -0500 (0:00:00.186) 0:15:02.954 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: 
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Friday 30 January 2026 18:51:24 -0500 (0:00:01.038) 0:15:03.992 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Friday 30 January 2026 18:51:24 -0500 (0:00:00.184) 0:15:04.177 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Friday 30 January 2026 18:51:25 -0500 (0:00:00.346) 0:15:04.524 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Friday 30 January 2026 18:51:25 -0500 (0:00:00.447) 0:15:04.971 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Friday 30 January 2026 18:51:25 -0500 (0:00:00.201) 0:15:05.173 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Friday 30 January 2026 18:51:26 -0500 (0:00:00.251) 0:15:05.424 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Friday 30 January 2026 18:51:26 -0500 (0:00:00.205) 0:15:05.629 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Friday 30 January 2026 18:51:26 -0500 (0:00:00.228) 0:15:05.858 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Friday 30 January 2026 18:51:26 -0500 (0:00:00.186) 0:15:06.044 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Friday 30 January 2026 18:51:26 -0500 (0:00:00.229) 0:15:06.273 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Friday 30 January 2026 18:51:27 -0500 (0:00:00.182) 0:15:06.456 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 30 January 2026 18:51:27 -0500 (0:00:00.208) 0:15:06.665 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 30 January 2026 18:51:27 -0500 (0:00:00.471) 0:15:07.136 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 30 January 2026 18:51:28 -0500 (0:00:00.322) 0:15:07.459 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 30 January 2026 18:51:28 -0500 (0:00:00.286) 0:15:07.745 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 30 January 2026 18:51:28 -0500 (0:00:00.250) 0:15:07.996 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
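The fstab checks above work by counting matches: the "Set some variables for fstab checking" task collects the /etc/fstab lines that mention the device, the mount point, and the mount options, and the verify tasks then assert that the counts equal the expected values ("1" each here). A sketch of the assertion shape, using the variable names visible in the facts above (the exact task bodies in the test file may differ):

  - name: Verify that the device identifier appears in /etc/fstab
    assert:
      that:
        - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int
      msg: "Unexpected number of fstab entries for the LUKS device"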
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 30 January 2026 18:51:28 -0500 (0:00:00.207) 0:15:08.203 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 30 January 2026 18:51:29 -0500 (0:00:00.184) 0:15:08.388 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 30 January 2026 18:51:29 -0500 (0:00:00.283) 0:15:08.671 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 30 January 2026 18:51:29 -0500 (0:00:00.355) 0:15:09.027 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817034.4958184, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817034.4958184, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 221074, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769817034.4958184, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 30 January 2026 18:51:30 -0500 (0:00:01.044) 0:15:10.072 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 30 January 2026 18:51:31 -0500 (0:00:00.218) 0:15:10.290 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 30 January 2026 18:51:31 -0500 (0:00:00.227) 0:15:10.518 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
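The device-node verification follows a stat-then-assert pattern: stat the expected node, then assert on the returned stat fields (for a present volume, that the node exists and is a block device, as the isblk=true result above shows). A hedged sketch of that pattern; the __dev_path variable and the register name are illustrative, not the test's actual names:

  - name: See whether the device node is present
    stat:
      path: "{{ __dev_path }}"   # e.g. /dev/mapper/foo-test1, per the log
      follow: true
    register: storage_test_dev

  - name: Verify the presence/absence of the device node
    assert:
      that:
        - storage_test_dev.stat.exists
        - storage_test_dev.stat.isblk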
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 30 January 2026 18:51:31 -0500 (0:00:00.231) 0:15:10.749 ********
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 30 January 2026 18:51:31 -0500 (0:00:00.215) 0:15:10.964 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 30 January 2026 18:51:31 -0500 (0:00:00.135) 0:15:11.100 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 30 January 2026 18:51:31 -0500 (0:00:00.154) 0:15:11.254 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817034.6438181, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817034.6438181, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 221183, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769817034.6438181, "nlink": 1, "path": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 30 January 2026 18:51:33 -0500 (0:00:01.209) 0:15:12.464 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 30 January 2026 18:51:36 -0500 (0:00:03.739) 0:15:16.204 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010733", "end": "2026-01-30 18:51:37.956996", "rc": 0, "start": "2026-01-30 18:51:37.946263" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           c3cb5b2a-dd54-432f-b831-c342bb741a6d
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     899029
        Threads:    2
        Salt:       39 25 01 42 d2 8e f0 14 9b 5d 07 57 8d 7e 63 1d
                    6d a9 18 8c 80 75 f2 63 cb de 03 ff 5d e5 06 db
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120249
        Salt:       95 e7 75 4c c5 00 3f f1 f2 92 93 46 72 d4 3a 3b
                    de 50 1f ee 02 28 f2 e5 e7 d0 55 df 6a 0f b9 1b
        Digest:     08 29 3a b5 c7 f1 6f 1e 25 75 d5 bc 64 a2 35 82
                    17 12 42 d0 43 71 60 c1 a1 fb 6b 05 16 6f 38 5a

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 30 January 2026 18:51:38 -0500 (0:00:01.266) 0:15:17.470 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 30 January 2026 18:51:38 -0500 (0:00:00.199) 0:15:17.670 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 30 January 2026 18:51:38 -0500 (0:00:00.255) 0:15:17.925 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 30 January 2026 18:51:38 -0500 (0:00:00.170) 0:15:18.096 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 30 January 2026 18:51:38 -0500 (0:00:00.173) 0:15:18.270 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Friday 30 January 2026 18:51:39 -0500 (0:00:00.323) 0:15:18.594 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Friday 30 January 2026 18:51:39 -0500 (0:00:00.404) 0:15:18.998 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Friday 30 January 2026 18:51:39 -0500 (0:00:00.200) 0:15:19.198 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
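The version, key-size, and cipher checks that follow the luksDump call can be read straight off the header dump above: Version 2, a 512-bit key, and the aes-xts-plain64 cipher. A sketch of how such assertions can be written against registered luksDump output; the luks_dump register name is illustrative, not necessarily what the test file uses:

  - name: Check LUKS version
    assert:
      that:
        - luks_dump.stdout is search('Version:\s+2')

  - name: Check LUKS key size
    assert:
      that:
        - luks_dump.stdout is search('Key:\s+512 bits')

  - name: Check LUKS cipher
    assert:
      that:
        - luks_dump.stdout is search('cipher:\s+aes-xts-plain64')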
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Friday 30 January 2026 18:51:40 -0500 (0:00:00.268) 0:15:19.466 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Friday 30 January 2026 18:51:40 -0500 (0:00:00.204) 0:15:19.671 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Friday 30 January 2026 18:51:40 -0500 (0:00:00.233) 0:15:19.904 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Friday 30 January 2026 18:51:40 -0500 (0:00:00.299) 0:15:20.203 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Friday 30 January 2026 18:51:41 -0500 (0:00:00.172) 0:15:20.375 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 30 January 2026 18:51:41 -0500 (0:00:00.312) 0:15:20.688 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 30 January 2026 18:51:41 -0500 (0:00:00.306) 0:15:20.994 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 30 January 2026 18:51:41 -0500 (0:00:00.234) 0:15:21.229 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 30 January 2026 18:51:42 -0500 (0:00:00.192) 0:15:21.421 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 30 January 2026 18:51:42 -0500 (0:00:00.222) 0:15:21.644 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 30 January 2026 18:51:42 -0500 (0:00:00.214) 0:15:21.859 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 30 January 2026 18:51:42 -0500 (0:00:00.182) 0:15:22.042 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 30 January 2026 18:51:42 -0500 (0:00:00.125) 0:15:22.167 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 30 January 2026 18:51:43 -0500 (0:00:00.225) 0:15:22.393 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 30 January 2026 18:51:43 -0500 (0:00:00.259) 0:15:22.653 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 30 January 2026 18:51:43 -0500 (0:00:00.159) 0:15:22.812 ********
ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 30 January 2026 18:51:46 -0500 (0:00:02.824) 0:15:25.637 ********
ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 30 January 2026 18:51:47 -0500 (0:00:01.448) 0:15:27.085 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 30 January 2026 18:51:48 -0500 (0:00:00.260) 0:15:27.346 ********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }
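The two size-parsing tasks above normalize one quantity into several notations: 4294967296 bytes is "4g" in LVM terms and "4GiB" for parted. The test collection does this with its own helper; for ad-hoc checks of the same equivalence, Ansible's built-in size filters can be used. A small illustrative example, not the test's implementation:

  - name: Show size equivalence (illustrative only)
    debug:
      msg:
        - "{{ '4g' | human_to_bytes }}"          # -> 4294967296
        - "{{ 4294967296 | human_readable }}"    # -> "4.00 GB"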
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 30 January 2026 18:51:48 -0500 (0:00:00.267) 0:15:27.613 ********
ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 30 January 2026 18:51:49 -0500 (0:00:01.538) 0:15:29.151 ********
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 30 January 2026 18:51:50 -0500 (0:00:00.248) 0:15:29.399 ********
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 30 January 2026 18:51:50 -0500 (0:00:00.252) 0:15:29.652 ********
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 30 January 2026 18:51:50 -0500 (0:00:00.399) 0:15:30.052 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Friday 30 January 2026 18:51:50 -0500 (0:00:00.219) 0:15:30.271 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Friday 30 January 2026 18:51:51 -0500 (0:00:00.252) 0:15:30.523 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Friday 30 January 2026 18:51:51 -0500 (0:00:00.279) 0:15:30.803 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Friday 30 January 2026 18:51:51 -0500 (0:00:00.252) 0:15:31.056 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Friday 30 January 2026 18:51:52 -0500 (0:00:00.253) 0:15:31.310 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Friday 30 January 2026 18:51:52 -0500 (0:00:00.297) 0:15:31.607 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Friday 30 January 2026 18:51:52 -0500 (0:00:00.169) 0:15:31.776 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Friday 30 January 2026 18:51:52 -0500 (0:00:00.203) 0:15:31.980 ********
skipping: [managed-node2] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Friday 30 January 2026 18:51:52 -0500 (0:00:00.171) 0:15:32.151 ********
skipping: [managed-node2] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Friday 30 January 2026 18:51:53 -0500 (0:00:00.307) 0:15:32.459 ********
skipping: [managed-node2] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Friday 30 January 2026 18:51:53 -0500 (0:00:00.382) 0:15:32.841 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Friday 30 January 2026 18:51:53 -0500 (0:00:00.217) 0:15:33.059 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Friday 30 January 2026 18:51:54 -0500 (0:00:00.256) 0:15:33.315 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Friday 30 January 2026 18:51:54 -0500 (0:00:00.214) 0:15:33.529 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Friday 30 January 2026 18:51:54 -0500 (0:00:00.232) 0:15:33.762 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
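All of the thin-pool and percentage-size tasks above are skipped because this volume has a fixed "4g" size. When a volume size is instead given as a percentage, the expected size is derived from the parent pool size fetched a few tasks back. A rough sketch of that calculation, with hypothetical variable names (__pool_size_bytes, __volume_size_pct) that are not the test's own:

  - name: Calculate the expected size based on pool size and percentage value
    set_fact:
      storage_test_expected_size: >-
        {{ ((__pool_size_bytes | int) *
            ((__volume_size_pct | replace('%', '') | float) / 100)) | int }}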
TASK [Show actual size] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Friday 30 January 2026 18:51:54 -0500 (0:00:00.165) 0:15:33.927 ********
ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Friday 30 January 2026 18:51:54 -0500 (0:00:00.151) 0:15:34.078 ********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Friday 30 January 2026 18:51:55 -0500 (0:00:00.213) 0:15:34.292 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 30 January 2026 18:51:55 -0500 (0:00:00.190) 0:15:34.482 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024360", "end": "2026-01-30 18:51:56.566090", "rc": 0, "start": "2026-01-30 18:51:56.541730" }
STDOUT:
LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 30 January 2026 18:51:56 -0500 (0:00:01.602) 0:15:36.084 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 30 January 2026 18:51:56 -0500 (0:00:00.186) 0:15:36.270 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 30 January 2026 18:51:57 -0500 (0:00:00.490) 0:15:36.761 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 30 January 2026 18:51:57 -0500 (0:00:00.250) 0:15:37.012 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
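The cache verification starts from the lvs --nameprefixes output above, which prints shell-style KEY=value pairs (LVM2_SEGTYPE=linear, and so on). The "Set LV segment type" fact can be derived from that output with a regex; a sketch under that assumption, with an illustrative register name (lvs_info):

  - name: Set LV segment type
    set_fact:
      storage_test_lv_segtype: "{{ lvs_info.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"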
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 30 January 2026 18:51:58 -0500 (0:00:00.308) 0:15:37.320 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 30 January 2026 18:51:58 -0500 (0:00:00.296) 0:15:37.617 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 30 January 2026 18:51:58 -0500 (0:00:00.240) 0:15:37.858 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Friday 30 January 2026 18:51:58 -0500 (0:00:00.350) 0:15:38.209 ********

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Friday 30 January 2026 18:51:59 -0500 (0:00:00.185) 0:15:38.394 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Verify preservation of encryption settings on existing LVM volume] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:409
Friday 30 January 2026 18:51:59 -0500 (0:00:00.238) 0:15:38.633 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:51:59 -0500 (0:00:00.457) 0:15:39.091 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:52:00 -0500 (0:00:00.423) 0:15:39.514 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:52:00 -0500 (0:00:00.192) 0:15:39.707 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:52:00 -0500 (0:00:00.558) 0:15:40.265 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:52:01 -0500 (0:00:00.340) 0:15:40.606 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:52:01 -0500 (0:00:00.264) 0:15:40.870 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:52:01 -0500 (0:00:00.220) 0:15:41.091 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 30 January 2026 18:52:02 -0500 (0:00:00.234) 0:15:41.326 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 30 January 2026 18:52:02 -0500 (0:00:00.623) 0:15:41.949 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 30 January 2026 18:52:06 -0500 (0:00:04.244) 0:15:46.194 ********
ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }
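The platform/version variable loading above follows the usual linux_system_roles pattern: try increasingly specific vars files and load each one that exists, so more specific files override generic ones; the duplicate CentOS_8.yml item suggests that on this host both the major-version file name and the full-version file name resolve to the same file. A sketch of the pattern; the paths and the exact loop items are illustrative rather than copied from the role:

  - name: Set platform/version specific variables
    include_vars: "{{ __vars_file }}"
    loop:
      - "{{ ansible_facts['os_family'] }}.yml"
      - "{{ ansible_facts['distribution'] }}.yml"
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
    vars:
      __vars_file: "{{ role_path }}/vars/{{ item }}"
    when: __vars_file is file

Note also the inline Jinja entry in blivet_package_list: the "{{ 'libblockdev-s390' if ... else 'libblockdev' }}" string stays templated in the loaded vars and resolves per-architecture when the list is consumed.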
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 30 January 2026 18:52:07 -0500 (0:00:00.280) 0:15:46.474 ********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 30 January 2026 18:52:07 -0500 (0:00:00.403) 0:15:46.878 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 30 January 2026 18:52:13 -0500 (0:00:06.059) 0:15:52.937 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 30 January 2026 18:52:13 -0500 (0:00:00.334) 0:15:53.272 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 30 January 2026 18:52:14 -0500 (0:00:00.180) 0:15:53.453 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 30 January 2026 18:52:14 -0500 (0:00:00.290) 0:15:53.743 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Friday 30 January 2026 18:52:14 -0500 (0:00:00.246) 0:15:53.990 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Friday 30 January 2026 18:52:19 -0500 (0:00:04.382) 0:15:58.372 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name":
"blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": 
"dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service": { "name": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service": { "name": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { 
"name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:52:22 -0500 (0:00:03.673) 0:16:02.045 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:52:23 -0500 (0:00:00.378) 0:16:02.424 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dab99b751\x2d7175\x2d4a26\x2dbc8f\x2dcc52ceb754a6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "name": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket -.mount dev-sda1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice tmp.mount", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6 /dev/sda1 /tmp/storage_test4gmic4o6lukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-ab99b751-7175-4a26-bc8f-cc52ceb754a6 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": 
"no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice -.mount", "RequiresMountsFor": "/tmp/storage_test4gmic4o6lukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:50:47 EST", "StateChangeTimestampMonotonic": "2453316917", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d7175\x2d4a26\x2dbc8f\x2dcc52ceb754a6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "name": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", 
"ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:52:26 -0500 (0:00:03.272) 0:16:05.697 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:52:31 -0500 (0:00:05.364) 0:16:11.061 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:52:31 -0500 (0:00:00.196) 0:16:11.257 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817044.8717926, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e6372f652392f3f5edae66914a466d22ea65c4ce", "ctime": 1769817044.8687925, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769817044.8687925, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:52:33 -0500 (0:00:01.776) 0:16:13.034 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:52:34 -0500 (0:00:00.315) 0:16:13.349 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dab99b751\x2d7175\x2d4a26\x2dbc8f\x2dcc52ceb754a6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "name": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": 
"no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", 
"NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dab99b751\\x2d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...d7175\x2d4a26\x2dbc8f\x2dcc52ceb754a6.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "name": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config 
cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d7175\\x2d4a26\\x2dbc8f\\x2dcc52ceb754a6.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", 
"ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:52:37 -0500 (0:00:03.709) 0:16:17.059 ******** ok: [managed-node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, 
"fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:52:38 -0500 (0:00:00.342) 0:16:17.401 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:52:38 -0500 (0:00:00.283) 0:16:17.685 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:52:38 -0500 (0:00:00.322) 0:16:18.008 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:52:38 -0500 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:52:38 -0500 (0:00:00.277) 0:16:18.285 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:52:40 -0500 (0:00:01.851) 0:16:20.137 ******** ok: [managed-node2] => (item={'src': '/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" }
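Note: the "Set up new/current mounts" result above maps directly onto the stock mount module; a minimal sketch using the values from this run:

  - name: Mount the decrypted LUKS volume and persist it in /etc/fstab
    mount:
      src: /dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d
      path: /opt/test1
      fstype: xfs
      opts: defaults
      state: mounted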
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:52:42 -0500 (0:00:01.734) 0:16:21.872 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "mounted" }, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:52:42 -0500 (0:00:00.384) 0:16:22.256 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:52:45 -0500 (0:00:02.086) 0:16:24.342 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817058.1927595, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d88a458f6ec2ff311d8d233aaa04382122634008", "ctime": 1769817052.1627746, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 354418821, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1769817052.1617744, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1583496756", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:52:46 -0500 (0:00:01.765) 0:16:26.108 ********
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:52:47 -0500 (0:00:00.215) 0:16:26.323 ******** ok: [managed-node2]
TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:423 Friday 30 January 2026 18:52:49 -0500 (0:00:02.053) 0:16:28.376 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:430 Friday 30 January 2026 18:52:49 -0500 (0:00:00.326) 0:16:28.703 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2
TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:52:49 -0500 (0:00:00.460) 0:16:29.163 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }
TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:52:50 -0500 (0:00:00.277) 0:16:29.440 ******** skipping: [managed-node2] => {}
TASK [Collect info
about the volumes.] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:52:50 -0500 (0:00:00.367) 0:16:29.808 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "c3cb5b2a-dd54-432f-b831-c342bb741a6d" }, "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "size": "4G", "type": "crypt", "uuid": "29b7e385-54d2-40be-8fa1-5114cc36992c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:52:52 -0500 (0:00:01.741) 0:16:31.550 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002606", "end": "2026-01-30 18:52:53.539314", "rc": 0, "start": "2026-01-30 18:52:53.536708" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Friday 30 January 2026 18:52:52 -0500 (0:00:01.741) 0:16:31.550 ********
ok: [managed-node2] => {"changed": false, "cmd": ["cat", "/etc/fstab"], "delta": "0:00:00.002606", "end": "2026-01-30 18:52:53.539314", "rc": 0, "start": "2026-01-30 18:52:53.536708"}
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Friday 30 January 2026 18:52:53 -0500 (0:00:01.614) 0:16:33.164 ********
ok: [managed-node2] => {"changed": false, "cmd": ["cat", "/etc/crypttab"], "delta": "0:00:00.002539", "end": "2026-01-30 18:52:55.425281", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:52:55.422742"}
STDOUT:
luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /dev/mapper/foo-test1 -
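The single crypttab line follows the crypttab(5) field order: mapped name, backing device, key file, where "-" means no key file is recorded and a passphrase prompt is expected at activation. A sketch of how such an entry could be asserted (variable name assumed; the suite's own crypttab checks appear further down in this log):

  - name: Assert the crypttab entry shape (sketch)
    ansible.builtin.assert:
      that:
        - storage_test_crypttab.stdout_lines | length == 1
        - storage_test_crypttab.stdout.split()[1] == '/dev/mapper/foo-test1'
        - storage_test_crypttab.stdout.split()[2] == '-'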
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Friday 30 January 2026 18:52:55 -0500 (0:00:01.851) 0:16:35.016 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Friday 30 January 2026 18:52:56 -0500 (0:00:00.456) 0:16:35.472 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_pool_tests": ["members", "volumes"]}, "changed": false}

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Friday 30 January 2026 18:52:56 -0500 (0:00:00.125) 0:16:35.597 ********
ok: [managed-node2] => {"changed": false, "cmd": ["vgs", "--noheadings", "--binary", "-o", "shared", "foo"], "delta": "0:00:00.025034", "end": "2026-01-30 18:52:57.739286", "rc": 0, "start": "2026-01-30 18:52:57.714252"}
STDOUT:
0

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Friday 30 January 2026 18:52:57 -0500 (0:00:01.617) 0:16:37.214 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Friday 30 January 2026 18:52:58 -0500 (0:00:00.285) 0:16:37.500 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Friday 30 January 2026 18:52:58 -0500 (0:00:00.436) 0:16:37.936 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": ["/dev/sda"]}, "changed": false}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Friday 30 January 2026 18:52:59 -0500 (0:00:00.430) 0:16:38.366 ********
ok: [managed-node2] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda"}

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Friday 30 January 2026 18:53:00 -0500 (0:00:01.586) 0:16:39.953 ********
ok: [managed-node2] => {"ansible_facts": {"__pvs_lvm_len": "1"}, "changed": false}

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Friday 30 January 2026 18:53:00 -0500 (0:00:00.317) 0:16:40.271 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_pool_pvs": ["/dev/sda"]}, "changed": false}

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Friday 30 January 2026 18:53:01 -0500 (0:00:00.328) 0:16:40.600 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed
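The member checks above resolve the pool to exactly one PV. Outside the suite, the same count could be taken directly from LVM — a sketch, assuming the registered variable name:

  - name: List PVs backing VG foo (sketch)
    ansible.builtin.command: pvs --noheadings -o pv_name --select vg_name=foo
    register: pv_list
    changed_when: false

  - name: Assert exactly one member PV
    ansible.builtin.assert:
      that:
        - pv_list.stdout_lines | length == 1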
TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Friday 30 January 2026 18:53:01 -0500 (0:00:00.461) 0:16:41.061 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [Set expected pv type - 2] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Friday 30 January 2026 18:53:02 -0500 (0:00:00.309) 0:16:41.371 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [Set expected pv type - 3] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Friday 30 January 2026 18:53:02 -0500 (0:00:00.333) 0:16:41.704 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55
Friday 30 January 2026 18:53:02 -0500 (0:00:00.291) 0:16:41.996 ********
ok: [managed-node2] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda"}
MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68
Friday 30 January 2026 18:53:03 -0500 (0:00:00.400) 0:16:42.397 ********
ok: [managed-node2] => {"changed": false, "failed_when_result": false, "rc": 1}
STDERR:
Shared connection to 10.31.47.227 closed.
MSG: non-zero return code

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78
Friday 30 January 2026 18:53:04 -0500 (0:00:01.314) 0:16:43.711 ********
skipping: [managed-node2] => (item=/dev/sda) => {"ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda"}
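The grow-to-fill probe above is allowed to fail (note "failed_when_result": false); rc 1 here means the installed blivet does not advertise the feature, so the follow-up fill check is skipped. The probe script itself is not reproduced in this log; a hypothetical equivalent might look like:

  - name: Probe blivet for grow-to-fill support (hypothetical; actual probe script not shown)
    ansible.builtin.command: >-
      python3 -c "import blivet.formats.lvmpv as m;
      raise SystemExit(0 if hasattr(m.LVMPhysicalVolume, 'grow_to_fill') else 1)"
    register: blivet_gtf
    failed_when: false
    changed_when: false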
TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88
Friday 30 January 2026 18:53:04 -0500 (0:00:00.222) 0:16:43.934 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Friday 30 January 2026 18:53:05 -0500 (0:00:00.949) 0:16:44.883 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Friday 30 January 2026 18:53:05 -0500 (0:00:00.198) 0:16:45.082 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Friday 30 January 2026 18:53:06 -0500 (0:00:00.245) 0:16:45.327 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Friday 30 January 2026 18:53:06 -0500 (0:00:00.428) 0:16:45.756 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Friday 30 January 2026 18:53:06 -0500 (0:00:00.261) 0:16:46.018 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Friday 30 January 2026 18:53:07 -0500 (0:00:00.274) 0:16:46.292 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Friday 30 January 2026 18:53:07 -0500 (0:00:00.209) 0:16:46.502 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Friday 30 January 2026 18:53:07 -0500 (0:00:00.273) 0:16:46.775 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Friday 30 January 2026 18:53:07 -0500 (0:00:00.448) 0:16:47.223 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Friday 30 January 2026 18:53:08 -0500 (0:00:00.272) 0:16:47.496 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Friday 30 January 2026 18:53:08 -0500 (0:00:00.292) 0:16:47.788 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null}, "changed": false}

TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91
Friday 30 January 2026 18:53:08 -0500 (0:00:00.223) 0:16:48.011 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Friday 30 January 2026 18:53:09 -0500 (0:00:00.519) 0:16:48.531 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Friday 30 January 2026 18:53:09 -0500 (0:00:00.438) 0:16:48.970 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Friday 30 January 2026 18:53:09 -0500 (0:00:00.260) 0:16:49.230 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Friday 30 January 2026 18:53:10 -0500 (0:00:00.232) 0:16:49.463 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Friday 30 January 2026 18:53:10 -0500 (0:00:00.177) 0:16:49.640 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Friday 30 January 2026 18:53:10 -0500 (0:00:00.280) 0:16:49.921 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Friday 30 January 2026 18:53:10 -0500 (0:00:00.352) 0:16:50.273 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Friday 30 January 2026 18:53:11 -0500 (0:00:00.261) 0:16:50.535 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94
Friday 30 January 2026 18:53:11 -0500 (0:00:00.172) 0:16:50.707 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Friday 30 January 2026 18:53:11 -0500 (0:00:00.495) 0:16:51.203 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Friday 30 January 2026 18:53:12 -0500 (0:00:00.365) 0:16:51.569 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Friday 30 January 2026 18:53:12 -0500 (0:00:00.170) 0:16:51.739 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Friday 30 January 2026 18:53:12 -0500 (0:00:00.203) 0:16:51.943 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Friday 30 January 2026 18:53:12 -0500 (0:00:00.287) 0:16:52.230 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_thin_status": null}, "changed": false}

TASK [Check member encryption] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97
Friday 30 January 2026 18:53:13 -0500 (0:00:00.207) 0:16:52.437 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Friday 30 January 2026 18:53:13 -0500 (0:00:00.583) 0:16:53.021 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Friday 30 January 2026 18:53:14 -0500 (0:00:00.340) 0:16:53.361 ********
skipping: [managed-node2] => (item=/dev/sda) => {"_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False"}

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Friday 30 January 2026 18:53:14 -0500 (0:00:00.270) 0:16:53.632 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Friday 30 January 2026 18:53:14 -0500 (0:00:00.396) 0:16:54.028 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": []}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Friday 30 January 2026 18:53:15 -0500 (0:00:00.298) 0:16:54.327 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Friday 30 January 2026 18:53:15 -0500 (0:00:00.301) 0:16:54.629 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Friday 30 January 2026 18:53:15 -0500 (0:00:00.223) 0:16:54.852 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Friday 30 January 2026 18:53:15 -0500 (0:00:00.310) 0:16:55.162 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Friday 30 January 2026 18:53:16 -0500 (0:00:00.269) 0:16:55.432 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Friday 30 January 2026 18:53:16 -0500 (0:00:00.124) 0:16:55.556 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Friday 30 January 2026 18:53:16 -0500 (0:00:00.171) 0:16:55.728 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Friday 30 January 2026 18:53:16 -0500 (0:00:00.472) 0:16:56.200 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Friday 30 January 2026 18:53:17 -0500 (0:00:00.391) 0:16:56.592 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Friday 30 January 2026 18:53:17 -0500 (0:00:00.267) 0:16:56.859 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Friday 30 January 2026 18:53:17 -0500 (0:00:00.266) 0:16:57.126 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Friday 30 January 2026 18:53:18 -0500 (0:00:00.285) 0:16:57.411 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Friday 30 January 2026 18:53:18 -0500 (0:00:00.238) 0:16:57.649 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Friday 30 January 2026 18:53:18 -0500 (0:00:00.228) 0:16:57.878 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Friday 30 January 2026 18:53:18 -0500 (0:00:00.221) 0:16:58.100 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Friday 30 January 2026 18:53:18 -0500 (0:00:00.158) 0:16:58.258 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Friday 30 January 2026 18:53:19 -0500 (0:00:00.586) 0:16:58.844 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Print script output] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Friday 30 January 2026 18:53:19 -0500 (0:00:00.295) 0:16:59.140 ********
skipping: [managed-node2] => {}

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Friday 30 January 2026 18:53:20 -0500 (0:00:00.248) 0:16:59.388 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Friday 30 January 2026 18:53:20 -0500 (0:00:00.216) 0:16:59.605 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Friday 30 January 2026 18:53:20 -0500 (0:00:00.200) 0:16:59.806 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Friday 30 January 2026 18:53:20 -0500 (0:00:00.211) 0:17:00.018 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Friday 30 January 2026 18:53:20 -0500 (0:00:00.210) 0:17:00.229 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_stratis_report": null}, "changed": false}

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Friday 30 January 2026 18:53:21 -0500 (0:00:00.271) 0:17:00.500 ********
ok: [managed-node2] => {"ansible_facts": {"__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Friday 30 January 2026 18:53:21 -0500 (0:00:00.226) 0:17:00.727 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Friday 30 January 2026 18:53:21 -0500 (0:00:00.372) 0:17:01.099 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Friday 30 January 2026 18:53:22 -0500 (0:00:00.200) 0:17:01.300 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Friday 30 January 2026 18:53:22 -0500 (0:00:00.672) 0:17:01.972 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Friday 30 January 2026 18:53:23 -0500 (0:00:00.492) 0:17:02.464 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Friday 30 January 2026 18:53:23 -0500 (0:00:00.242) 0:17:02.707 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Friday 30 January 2026 18:53:23 -0500 (0:00:00.288) 0:17:02.995 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Friday 30 January 2026 18:53:24 -0500 (0:00:00.787) 0:17:03.782 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Friday 30 January 2026 18:53:24 -0500 (0:00:00.255) 0:17:04.038 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Friday 30 January 2026 18:53:25 -0500 (0:00:00.289) 0:17:04.327 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Friday 30 January 2026 18:53:25 -0500 (0:00:00.393) 0:17:04.720 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
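The mount-state check above confirms /opt/test1 is served by the LUKS mapping rather than the raw LV. Condensed to a single assertion over gathered facts, it would look roughly like this (a sketch, not the suite's exact expression):

  - name: Verify /opt/test1 is mounted from the LUKS mapping (sketch)
    ansible.builtin.assert:
      that:
        - ansible_facts['mounts']
          | selectattr('mount', 'equalto', '/opt/test1')
          | selectattr('device', 'equalto', storage_test_device_path)
          | list | length == 1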
TASK [Gather swap info] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Friday 30 January 2026 18:53:25 -0500 (0:00:00.243) 0:17:04.963 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Friday 30 January 2026 18:53:25 -0500 (0:00:00.274) 0:17:05.238 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Friday 30 January 2026 18:53:26 -0500 (0:00:00.310) 0:17:05.548 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 30 January 2026 18:53:26 -0500 (0:00:00.185) 0:17:05.734 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 30 January 2026 18:53:27 -0500 (0:00:00.569) 0:17:06.304 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 30 January 2026 18:53:27 -0500 (0:00:00.251) 0:17:06.556 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 30 January 2026 18:53:27 -0500 (0:00:00.161) 0:17:06.717 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 30 January 2026 18:53:27 -0500 (0:00:00.159) 0:17:06.877 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed
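The fstab variables above are substring matches pulled from the earlier cat /etc/fstab output: exactly one line starts with the mapper path, and that line carries /opt/test1 with the default options. Recomputing the id match from the registered output could look like this (sketch; the registered variable name is assumed):

  - name: Recount fstab lines for the device (sketch)
    ansible.builtin.set_fact:
      fstab_id_matches: "{{ storage_test_fstab.stdout_lines
        | select('search', '^/dev/mapper/luks-') | list }}"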
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 30 January 2026 18:53:27 -0500 (0:00:00.180) 0:17:07.058 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 30 January 2026 18:53:27 -0500 (0:00:00.193) 0:17:07.251 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 30 January 2026 18:53:28 -0500 (0:00:00.264) 0:17:07.516 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 30 January 2026 18:53:28 -0500 (0:00:00.286) 0:17:07.802 ********
ok: [managed-node2] => {"changed": false, "stat": {"atime": 1769817097.9516604, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817034.4958184, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 221074, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769817034.4958184, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 30 January 2026 18:53:29 -0500 (0:00:01.247) 0:17:09.049 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 30 January 2026 18:53:30 -0500 (0:00:00.316) 0:17:09.366 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 30 January 2026 18:53:30 -0500 (0:00:00.251) 0:17:09.617 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 30 January 2026 18:53:30 -0500 (0:00:00.185) 0:17:09.802 ********
ok: [managed-node2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 30 January 2026 18:53:30 -0500 (0:00:00.173) 0:17:09.976 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 30 January 2026 18:53:30 -0500 (0:00:00.245) 0:17:10.222 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 30 January 2026 18:53:31 -0500 (0:00:00.138) 0:17:10.360 ********
ok: [managed-node2] => {"changed": false, "stat": {"atime": 1769817151.4085274, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817034.6438181, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 221183, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769817034.6438181, "nlink": 1, "path": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 30 January 2026 18:53:32 -0500 (0:00:01.288) 0:17:11.648 ********
ok: [managed-node2] => {"changed": false, "rc": 0, "results": []}
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 30 January 2026 18:53:36 -0500 (0:00:04.049) 0:17:15.698 ********
ok: [managed-node2] => {"changed": false, "cmd": ["cryptsetup", "luksDump", "/dev/mapper/foo-test1"], "delta": "0:00:00.010451", "end": "2026-01-30 18:53:37.487626", "rc": 0, "start": "2026-01-30 18:53:37.477175"}
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           c3cb5b2a-dd54-432f-b831-c342bb741a6d
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     899029
        Threads:    2
        Salt:       39 25 01 42 d2 8e f0 14 9b 5d 07 57 8d 7e 63 1d
                    6d a9 18 8c 80 75 f2 63 cb de 03 ff 5d e5 06 db
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120249
        Salt:       95 e7 75 4c c5 00 3f f1 f2 92 93 46 72 d4 3a 3b
                    de 50 1f ee 02 28 f2 e5 e7 d0 55 df 6a 0f b9 1b
        Digest:     08 29 3a b5 c7 f1 6f 1e 25 75 d5 bc 64 a2 35 82
                    17 12 42 d0 43 71 60 c1 a1 fb 6b 05 16 6f 38 5a

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 30 January 2026 18:53:37 -0500 (0:00:01.291) 0:17:16.989 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 30 January 2026 18:53:37 -0500 (0:00:00.231) 0:17:17.220 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 30 January 2026 18:53:38 -0500 (0:00:00.229) 0:17:17.450 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 30 January 2026 18:53:38 -0500 (0:00:00.208) 0:17:17.658 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 30 January 2026 18:53:38 -0500 (0:00:00.173) 0:17:17.832 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Friday 30 January 2026 18:53:38 -0500 (0:00:00.184) 0:17:18.016 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Friday 30 January 2026 18:53:38 -0500 (0:00:00.223) 0:17:18.240 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
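Everything the LUKS checks need is in the luksDump header above: Version 2 confirms encryption_luks_version: luks2, and the 512-bit key under keyslot 0 matches the default key size. The version check boils down to something like this sketch (registered variable name assumed):

  - name: Dump the LUKS header (sketch)
    ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
    register: luks_dump
    changed_when: false

  - name: Assert the header is LUKS2
    ansible.builtin.assert:
      that:
        - luks_dump.stdout is search('Version:\s+2')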
TASK [Set test variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Friday 30 January 2026 18:53:39 -0500 (0:00:00.164) 0:17:18.404 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": ["luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /dev/mapper/foo-test1 -"], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Friday 30 January 2026 18:53:39 -0500 (0:00:00.255) 0:17:18.660 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Friday 30 January 2026 18:53:39 -0500 (0:00:00.186) 0:17:18.847 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Friday 30 January 2026 18:53:39 -0500 (0:00:00.284) 0:17:19.131 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Friday 30 January 2026 18:53:40 -0500 (0:00:00.192) 0:17:19.324 ********
ok: [managed-node2] => {
    "changed": false
}
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Friday 30 January 2026 18:53:40 -0500 (0:00:00.292) 0:17:19.616 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 30 January 2026 18:53:40 -0500 (0:00:00.177) 0:17:19.794 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 30 January 2026 18:53:40 -0500 (0:00:00.164) 0:17:19.958 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 30 January 2026 18:53:40 -0500 (0:00:00.220) 0:17:20.179 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 30 January 2026 18:53:41 -0500 (0:00:00.141) 0:17:20.320 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 30 January 2026 18:53:41 -0500 (0:00:00.161) 0:17:20.481 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 30 January 2026 18:53:41 -0500 (0:00:00.189) 0:17:20.670 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 30 January 2026 18:53:41 -0500 (0:00:00.236) 0:17:20.907 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 30 January 2026 18:53:41 -0500 (0:00:00.192) 0:17:21.099 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 30 January 2026 18:53:42 -0500 (0:00:00.231) 0:17:21.331 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 30 January 2026 18:53:42 -0500 (0:00:00.269) 0:17:21.601 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 30 January 2026 18:53:42 -0500 (0:00:00.138) 0:17:21.740 ********
ok: [managed-node2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 30 January 2026 18:53:43 -0500 (0:00:01.350) 0:17:23.090 ********
ok: [managed-node2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 30 January 2026 18:53:45 -0500 (0:00:01.472) 0:17:24.563 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_expected_size": "4294967296"}, "changed": false}

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 30 January 2026 18:53:45 -0500 (0:00:00.258) 0:17:24.821 ********
ok: [managed-node2] => {"storage_test_expected_size": "4294967296"}
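Note how the size checks treat the lowercase "4g" spec as GiB: both the actual and the requested size parse to 4294967296 bytes (4 * 2^30). Ansible's human_to_bytes filter reproduces the same arithmetic — a sketch:

  - name: Convert the requested size to bytes (sketch; matches 4294967296 above)
    ansible.builtin.set_fact:
      expected_bytes: "{{ '4 GiB' | human_to_bytes }}"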
************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:53:45 -0500 (0:00:00.206) 0:17:25.027 ******** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:53:47 -0500 (0:00:01.425) 0:17:26.453 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:53:47 -0500 (0:00:00.227) 0:17:26.680 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:53:47 -0500 (0:00:00.323) 0:17:27.004 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:53:47 -0500 (0:00:00.235) 0:17:27.239 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:53:48 -0500 (0:00:00.297) 0:17:27.537 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:53:48 -0500 (0:00:00.129) 0:17:27.666 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:53:48 -0500 (0:00:00.155) 0:17:27.822 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:53:48 -0500 (0:00:00.200) 0:17:28.022 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:53:48 -0500 (0:00:00.152) 0:17:28.175 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to 
max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:53:49 -0500 (0:00:00.180) 0:17:28.356 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:53:49 -0500 (0:00:00.210) 0:17:28.567 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:53:49 -0500 (0:00:00.336) 0:17:28.903 ******** skipping: [managed-node2] => {}
TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:53:49 -0500 (0:00:00.160) 0:17:29.064 ******** skipping: [managed-node2] => {}
TASK [Show test volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:53:49 -0500 (0:00:00.201) 0:17:29.266 ******** skipping: [managed-node2] => {}
TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:53:50 -0500 (0:00:00.118) 0:17:29.385 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:53:50 -0500 (0:00:00.129) 0:17:29.514 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:53:50 -0500 (0:00:00.220) 0:17:29.734 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:53:50 -0500 (0:00:00.229) 0:17:29.964 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:53:50 -0500 (0:00:00.164) 0:17:30.129 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
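All of the thin-pool arithmetic above is skipped because test1 is a plain fixed-size LVM volume; those branches only fire for percentage-sized and thin-provisioned volumes. What actually runs is the bytes-for-bytes comparison shown next: the requested size ("4g") and the measured size are both normalized to integer byte counts before the assert. Reduced to its core, the pattern looks roughly like this (a sketch only; the storage_test_* names are taken from this log, and the real tasks in test-verify-volume-size.yml carry the extra branching seen above):

    - name: Establish base value for expected size
      set_fact:
        storage_test_expected_size: "{{ '4g' | human_to_bytes }}"  # 4g -> 4294967296

    - name: Assert expected size is actual size
      assert:
        that:
          - (storage_test_actual_size.bytes | int) == (storage_test_expected_size | int)
        msg: "Volume size does not match the requested size"
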
TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:53:50 -0500 (0:00:00.139) 0:17:30.268 ******** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }
TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:53:51 -0500 (0:00:00.193) 0:17:30.462 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }
TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:53:51 -0500 (0:00:00.212) 0:17:30.674 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:53:51 -0500 (0:00:00.314) 0:17:30.989 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.022847", "end": "2026-01-30 18:53:52.789055", "rc": 0, "start": "2026-01-30 18:53:52.766208" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:53:52 -0500 (0:00:01.281) 0:17:32.271 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }
TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:53:53 -0500 (0:00:00.216) 0:17:32.488 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
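Both assertions above come from the same query-then-assert pattern: the test shells out to lvs with machine-readable flags (--noheadings --nameprefixes --unquoted), so every field arrives on stdout as LVM2_KEY=value, then lifts the field it cares about into a fact and asserts on it. Here LVM2_SEGTYPE=linear confirms the LV has no cache attached. A condensed sketch of that pattern (fact and task names follow this log; the actual parsing in test-verify-volume-cache.yml is more general):

    - name: Get information about the LV
      command: >-
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
      register: lvs_out
      changed_when: false

    - name: Set LV segment type
      set_fact:
        storage_test_lv_segtype: "{{ lvs_out.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"

    - name: Check segment type
      assert:
        that:
          - storage_test_lv_segtype == ['linear']
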
TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:53:53 -0500 (0:00:00.226) 0:17:32.714 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:53:53 -0500 (0:00:00.231) 0:17:32.946 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:53:53 -0500 (0:00:00.245) 0:17:33.191 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:53:54 -0500 (0:00:00.246) 0:17:33.438 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:53:54 -0500 (0:00:00.196) 0:17:33.635 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:53:54 -0500 (0:00:00.138) 0:17:33.774 ********
TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:53:54 -0500 (0:00:00.130) 0:17:33.904 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Create a file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 30 January 2026 18:53:54 -0500 (0:00:00.085) 0:17:33.990 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:436 Friday 30 January 2026 18:53:56 -0500 (0:00:01.469) 0:17:35.459 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2
TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:53:56 -0500 (0:00:00.363) 0:17:35.822 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:53:56 -0500 (0:00:00.150) 0:17:36.012 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:53:56 -0500 (0:00:00.150) 0:17:36.163 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml
for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:53:57 -0500 (0:00:00.305) 0:17:36.468 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:53:57 -0500 (0:00:00.250) 0:17:36.718 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:53:57 -0500 (0:00:00.336) 0:17:37.055 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:53:57 -0500 (0:00:00.110) 0:17:37.166 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:53:57 -0500 (0:00:00.120) 0:17:37.287 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:53:58 -0500 (0:00:00.040) 0:17:37.327 ******** ok: [managed-node2] => { 
"ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:53:58 -0500 (0:00:00.130) 0:17:37.458 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:53:58 -0500 (0:00:00.368) 0:17:37.826 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:54:03 -0500 (0:00:04.522) 0:17:42.349 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:54:03 -0500 (0:00:00.187) 0:17:42.536 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:54:03 -0500 (0:00:00.254) 0:17:42.791 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:54:08 -0500 (0:00:05.347) 0:17:48.138 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:54:09 -0500 (0:00:00.187) 0:17:48.325 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:54:09 -0500 (0:00:00.083) 0:17:48.409 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:54:09 -0500 (0:00:00.154) 0:17:48.563 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are 
installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:54:09 -0500 (0:00:00.097) 0:17:48.661 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:54:12 -0500 (0:00:03.540) 0:17:52.201 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": 
"static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { 
"name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": 
"rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service": { "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service": { "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:54:15 -0500 (0:00:02.730) 0:17:54.931 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:54:16 -0500 (0:00:00.701) 0:17:55.633 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dc3cb5b2a\x2ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", 
"LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:52:26 EST", "StateChangeTimestampMonotonic": "2552175322", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": 
"systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", 
"LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:54:19 -0500 (0:00:03.428) 0:17:59.062 ******** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:54:25 -0500 (0:00:05.640) 0:18:04.702 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 
TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Friday 30 January 2026 18:54:25 -0500 (0:00:05.640) 0:18:04.702 ********
fatal: [managed-node2]: FAILED! => { "changed": false }

MSG:

{'msg': "cannot remove existing formatting on device 'luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Friday 30 January 2026 18:54:25 -0500 (0:00:00.143) 0:18:04.846 ********
changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dc3cb5b2a\x2ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity",
"JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:52:26 EST", "StateChangeTimestampMonotonic": "2552175322", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", 
"TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", 
"LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Friday 30 January 2026 18:54:28 -0500 (0:00:03.077) 0:18:07.924 ********
ok: [managed-node2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Friday 30 January 2026 18:54:28 -0500 (0:00:00.207) 0:18:08.131 ********
ok: [managed-node2] => { "changed": false }

MSG:

All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Friday 30 January 2026 18:54:29 -0500 (0:00:00.251) 0:18:08.382 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Stat the file] ***********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Friday 30 January 2026 18:54:29 -0500 (0:00:00.191) 0:18:08.574 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817235.876317, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769817235.876317, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1769817235.876317, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3804743924", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Assert file presence] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Friday 30 January 2026 18:54:30 -0500 (0:00:01.178) 0:18:09.752 ********
ok: [managed-node2] => { "changed": false }

MSG:

All assertions passed
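The two tasks above are the data-preservation check: a file created on the encrypted volume before the failed removal attempt must still exist afterwards, proving that safe mode left the device untouched. The shape of such a check, as a sketch (task names and path mirror the log; the register variable name is an assumption for illustration):

- name: Stat the file
  stat:
    path: /opt/test1/quux
  register: __file_status          # register name assumed for this sketch

- name: Assert file presence
  assert:
    that:
      - __file_status.stat.exists
    msg: data file was lost during the failed operation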
TASK [Remove the encryption layer - 3] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:460
Friday 30 January 2026 18:54:30 -0500 (0:00:00.187) 0:18:09.939 ********

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:54:31 -0500 (0:00:00.574) 0:18:10.514 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:54:31 -0500 (0:00:00.211) 0:18:10.856 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:54:31 -0500 (0:00:00.211) 0:18:11.068 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:54:32 -0500 (0:00:00.525) 0:18:11.594 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:54:32 -0500 (0:00:00.192) 0:18:11.786 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:54:32 -0500 (0:00:00.212) 0:18:11.999 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:54:32 -0500 (0:00:00.231) 0:18:12.231 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
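The pattern above, skipping RedHat.yml and CentOS.yml and then loading vars/CentOS_8.yml, is the usual platform/version variable selection: the role walks a list of candidate vars files derived from ansible_facts and loads each one that actually exists. A sketch of that mechanism (the candidate list and the existence test are assumptions, not the role's exact implementation; the log shows CentOS_8.yml considered twice, so the real list presumably also derives a name from the full distribution version):

- name: Set platform/version specific variables
  include_vars: "{{ item }}"
  loop:
    - "{{ ansible_facts['os_family'] }}.yml"          # RedHat.yml - skipped, no such file
    - "{{ ansible_facts['distribution'] }}.yml"       # CentOS.yml - skipped, no such file
    - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"   # CentOS_8.yml - loaded
  when: (role_path ~ '/vars/' ~ item) is file         # condition assumed; only load files that exist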
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 30 January 2026 18:54:33 -0500 (0:00:00.229) 0:18:12.461 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 30 January 2026 18:54:33 -0500 (0:00:00.507) 0:18:12.968 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 30 January 2026 18:54:37 -0500 (0:00:04.164) 0:18:17.133 ********
ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 30 January 2026 18:54:38 -0500 (0:00:00.332) 0:18:17.465 ********
ok: [managed-node2] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 30 January 2026 18:54:38 -0500 (0:00:00.241) 0:18:17.707 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 30 January 2026 18:54:43 -0500 (0:00:05.355) 0:18:23.062 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 30 January 2026 18:54:44 -0500 (0:00:00.438) 0:18:23.501 ********

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 30 January 2026 18:54:44 -0500 (0:00:00.135) 0:18:23.637 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 30 January 2026 18:54:44 -0500 (0:00:00.391) 0:18:24.029 ********

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Friday 30 January 2026 18:54:45 -0500 (0:00:00.269) 0:18:24.299 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path:
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:54:49 -0500 (0:00:04.342) 0:18:28.641 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { 
"name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": 
"nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service": { "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service": { "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:54:52 -0500 (0:00:03.108) 0:18:31.750 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:54:52 -0500 (0:00:00.234) 0:18:31.984 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dc3cb5b2a\x2ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:52:26 EST", "StateChangeTimestampMonotonic": "2552175322", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", 
"CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": 
"yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:54:56 -0500 (0:00:03.372) 0:18:35.357 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:55:02 -0500 (0:00:06.016) 0:18:41.373 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:55:02 -0500 (0:00:00.240) 0:18:41.614 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817044.8717926, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e6372f652392f3f5edae66914a466d22ea65c4ce", "ctime": 1769817044.8687925, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769817044.8687925, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:55:03 -0500 (0:00:01.215) 0:18:42.830 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:55:05 -0500 (0:00:01.751) 0:18:44.582 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dc3cb5b2a\x2ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": 
"none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "dev-mapper-luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.device cryptsetup.target", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:52:26 EST", "StateChangeTimestampMonotonic": "2552175322", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", 
"TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:55:08 -0500 (0:00:03.011) 0:18:47.594 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:55:08 -0500 (0:00:00.234) 0:18:47.829 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:55:08 -0500 (0:00:00.191) 0:18:48.020 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:55:08 -0500 (0:00:00.256) 0:18:48.276 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:55:10 -0500 (0:00:01.388) 0:18:49.665 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:55:12 -0500 (0:00:01.725) 0:18:51.391 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) 
=> { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:55:13 -0500 (0:00:01.517) 0:18:52.908 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:55:13 -0500 (0:00:00.219) 0:18:53.127 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:55:15 -0500 (0:00:01.649) 0:18:54.777 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817058.1927595, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d88a458f6ec2ff311d8d233aaa04382122634008", "ctime": 1769817052.1627746, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 354418821, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1769817052.1617744, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1583496756", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:55:16 -0500 (0:00:01.374) 0:18:56.151 ******** changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:55:18 -0500 (0:00:01.447) 0:18:57.599 ******** ok: [managed-node2] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:477 Friday 30 January 2026 18:55:20 -0500 (0:00:02.026) 0:18:59.626 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:55:20 -0500 (0:00:00.533) 0:19:00.159 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:55:21 -0500 (0:00:00.245) 0:19:00.404 ******** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:55:21 -0500 (0:00:00.211) 0:19:00.616 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "811b9896-15d3-4e73-8af8-ba5e30630656" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } }
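
The per-device dump above is essentially consolidated lsblk output: fstype, label, mountpoint, size, type, and uuid for every block device, which the verification tasks then compare against the expected pool layout. The test suite gathers this through its own helper; a rough equivalent as a plain task (the register name is illustrative, not the suite's real variable):

    - name: Collect block device info for verification
      ansible.builtin.command:
        cmd: lsblk --pairs -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID
      register: storage_test_blkinfo   # illustrative name
      changed_when: false

TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:55:22 -0500 (0:00:01.179) 0:19:01.796 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002574", "end": "2026-01-30 18:55:23.665897", "rc": 0, "start": "2026-01-30 18:55:23.663323" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd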
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:55:24 -0500 (0:00:01.513) 0:19:03.309 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002499", "end": "2026-01-30 18:55:25.219315", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:55:25.216816" }
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:55:25 -0500 (0:00:01.513) 0:19:04.822 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2
TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Friday 30 January 2026 18:55:25 -0500 (0:00:00.401) 0:19:05.224 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Friday 30 January 2026 18:55:26 -0500 (0:00:00.223) 0:19:05.448 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024588", "end": "2026-01-30 18:55:27.459472", "rc": 0, "start": "2026-01-30 18:55:27.434884" } STDOUT: 0
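
The vgs call above passes --binary so the shared attribute is reported as 0 or 1 rather than as text; STDOUT "0" means the foo volume group is not a shared (lvmlockd-managed) VG, which matches "shared": false in the pool spec. The same probe and assertion in sketch form, reusing the exact command from the log (register name is illustrative):

    - name: Get VG shared value status
      ansible.builtin.command:
        cmd: vgs --noheadings --binary -o shared foo
      register: vgs_shared   # illustrative register name
      changed_when: false

    - name: Verify that VG shared value checks out
      ansible.builtin.assert:
        that:
          - (vgs_shared.stdout | trim) == '0'

TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Friday 30 January 2026 18:55:27 -0500 (0:00:01.540) 0:19:06.989 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Friday 30 January 2026 18:55:28 -0500 (0:00:00.353) 0:19:07.342 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: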
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Friday 30 January 2026 18:55:28 -0500 (0:00:00.373) 0:19:07.715 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Friday 30 January 2026 18:55:28 -0500 (0:00:00.328) 0:19:08.044 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Friday 30 January 2026 18:55:30 -0500 (0:00:01.726) 0:19:09.770 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Friday 30 January 2026 18:55:30 -0500 (0:00:00.233) 0:19:10.004 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Friday 30 January 2026 18:55:31 -0500 (0:00:00.304) 0:19:10.308 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Friday 30 January 2026 18:55:31 -0500 (0:00:00.300) 0:19:10.609 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Friday 30 January 2026 18:55:31 -0500 (0:00:00.341) 0:19:10.951 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Friday 30 January 2026 18:55:31 -0500 (0:00:00.326) 0:19:11.277 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Friday 30 January 2026 18:55:32 -0500 (0:00:00.231) 0:19:11.509 ******** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", 
"changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Friday 30 January 2026 18:55:32 -0500 (0:00:00.424) 0:19:11.933 ******** ok: [managed-node2] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.47.227 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Friday 30 January 2026 18:55:34 -0500 (0:00:01.691) 0:19:13.625 ******** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Friday 30 January 2026 18:55:34 -0500 (0:00:00.233) 0:19:13.858 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Friday 30 January 2026 18:55:34 -0500 (0:00:00.405) 0:19:14.263 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Friday 30 January 2026 18:55:35 -0500 (0:00:00.335) 0:19:14.599 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Friday 30 January 2026 18:55:35 -0500 (0:00:00.189) 0:19:14.788 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Friday 30 January 2026 18:55:35 -0500 (0:00:00.184) 0:19:14.973 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Friday 30 January 2026 18:55:35 -0500 (0:00:00.283) 0:19:15.256 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Friday 30 January 2026 18:55:36 -0500 (0:00:00.316) 0:19:15.572 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Friday 30 January 2026 18:55:36 -0500 (0:00:00.121) 0:19:15.694 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Friday 30 January 2026 18:55:36 -0500 (0:00:00.275) 0:19:15.970 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Friday 30 January 2026 18:55:36 -0500 (0:00:00.200) 0:19:16.170 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Friday 30 January 2026 18:55:37 -0500 (0:00:00.145) 0:19:16.315 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Friday 30 January 2026 18:55:37 -0500 (0:00:00.185) 0:19:16.501 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Friday 30 January 2026 18:55:37 -0500 (0:00:00.226) 0:19:16.727 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Friday 30 January 2026 18:55:37 -0500 (0:00:00.398) 0:19:17.126 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Friday 30 January 2026 18:55:38 -0500 (0:00:00.355) 0:19:17.482 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Friday 30 January 2026 18:55:38 -0500 (0:00:00.152) 0:19:17.634 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Friday 30 January 2026 18:55:38 -0500 (0:00:00.228) 0:19:17.863 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Friday 30 January 2026 18:55:38 -0500 (0:00:00.222) 0:19:18.086 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Friday 30 January 2026 18:55:39 -0500 (0:00:00.213) 0:19:18.299 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Friday 30 January 2026 18:55:39 -0500 (0:00:00.210) 0:19:18.510 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Friday 30 January 2026 18:55:39 -0500 (0:00:00.156) 0:19:18.666 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Friday 30 January 2026 18:55:39 -0500 (0:00:00.109) 0:19:18.776 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Friday 30 January 2026 18:55:39 -0500 (0:00:00.409) 0:19:19.185 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Friday 30 January 2026 18:55:40 -0500 (0:00:00.204) 0:19:19.390 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Friday 30 January 2026 18:55:40 -0500 (0:00:00.110) 0:19:19.500 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 30 January 2026 18:55:40 -0500 (0:00:00.145) 0:19:19.646 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 30 January 2026 18:55:40 -0500 (0:00:00.251) 0:19:19.897 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 30 January 2026 18:55:40 -0500 (0:00:00.218) 0:19:20.116 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 30 January 2026 18:55:41 -0500 (0:00:00.990) 0:19:21.106 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 30 January 2026 18:55:42 -0500 (0:00:00.231) 0:19:21.337 ******** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 30 January 2026 18:55:42 -0500 (0:00:00.240) 0:19:21.578 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 30 January 2026 18:55:42 -0500 (0:00:00.412) 0:19:21.990 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 30 January 2026 18:55:42 -0500 (0:00:00.183) 0:19:22.173 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
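The crypttab checks above reduce to counting: the lines of /etc/crypttab that reference the pool member are collected, and the count is compared with _storage_test_expected_crypttab_entries ("0" here, since this pool member carries no LUKS layer). A minimal sketch of that kind of check, assuming a previously registered storage_test_crypttab result holding the contents of /etc/crypttab (that fact name is cleaned up later in this log; the 'luks-' filter and the _test_member_crypttab_entries name are illustrative, not the test's exact implementation):

- name: Collect crypttab entries for the pool member (illustrative)
  set_fact:
    _test_member_crypttab_entries: "{{ storage_test_crypttab.stdout_lines | select('search', 'luks-') | list }}"

- name: Check for /etc/crypttab entry (illustrative)
  assert:
    that:
      - _test_member_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
    msg: unexpected number of /etc/crypttab entries for the pool member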
False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Friday 30 January 2026 18:55:43 -0500 (0:00:00.169) 0:19:22.603 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Friday 30 January 2026 18:55:43 -0500 (0:00:00.117) 0:19:22.720 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Friday 30 January 2026 18:55:43 -0500 (0:00:00.120) 0:19:22.841 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Friday 30 January 2026 18:55:43 -0500 (0:00:00.186) 0:19:23.028 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Friday 30 January 2026 18:55:43 -0500 (0:00:00.138) 0:19:23.167 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Friday 30 January 2026 18:55:44 -0500 (0:00:00.356) 0:19:23.523 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Friday 30 January 2026 18:55:44 -0500 (0:00:00.347) 0:19:23.871 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Friday 30 January 2026 18:55:44 -0500 (0:00:00.213) 0:19:24.084 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Friday 30 January 2026 18:55:45 -0500 (0:00:00.265) 0:19:24.349 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about 
VDO compression] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Friday 30 January 2026 18:55:45 -0500 (0:00:00.173) 0:19:24.523 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is off] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Friday 30 January 2026 18:55:45 -0500 (0:00:00.210) 0:19:24.734 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is on] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Friday 30 January 2026 18:55:45 -0500 (0:00:00.217) 0:19:24.951 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Friday 30 January 2026 18:55:45 -0500 (0:00:00.170) 0:19:25.122 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Friday 30 January 2026 18:55:46 -0500 (0:00:00.242) 0:19:25.364 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Friday 30 January 2026 18:55:46 -0500 (0:00:00.416) 0:19:25.781 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Friday 30 January 2026 18:55:46 -0500 (0:00:00.143) 0:19:25.924 ******** skipping: [managed-node2] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Friday 30 January 2026 18:55:46 -0500 (0:00:00.171) 0:19:26.095 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Friday 30 January 2026 18:55:47 -0500 (0:00:00.203) 0:19:26.299 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Friday 30 January 2026 18:55:47 -0500 (0:00:00.166) 0:19:26.466
******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Friday 30 January 2026 18:55:47 -0500 (0:00:00.242) 0:19:26.708 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Friday 30 January 2026 18:55:47 -0500 (0:00:00.157) 0:19:26.865 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Friday 30 January 2026 18:55:47 -0500 (0:00:00.268) 0:19:27.134 ******** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Friday 30 January 2026 18:55:48 -0500 (0:00:00.234) 0:19:27.368 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:55:48 -0500 (0:00:00.411) 0:19:27.780 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:55:48 -0500 (0:00:00.232) 0:19:28.012 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2
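The eight includes above are driven by the _storage_volume_tests list set two tasks earlier: each entry names a test-verify-volume-<subset>.yml file. The loop itself is not shown in the log, but from the task name and the files it pulls in, it plausibly looks like this sketch:

- name: Run test verify for storage_test_volume_subset (inferred sketch)
  include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: storage_test_volume_subset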
TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:55:49 -0500 (0:00:01.122) 0:19:29.134 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:55:50 -0500 (0:00:00.226) 0:19:29.361 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:55:50 -0500 (0:00:00.294) 0:19:29.655 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:55:50 -0500 (0:00:00.300) 0:19:29.955 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 30 January 2026 18:55:50 -0500 (0:00:00.135) 0:19:30.091 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 30 January 2026 18:55:51 -0500 (0:00:00.208) 0:19:30.299 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 30 January 2026 18:55:51 -0500 (0:00:00.196) 0:19:30.496 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 30 January 2026 18:55:51 -0500 (0:00:00.131) 0:19:30.627 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 30 January 2026 18:55:51 -0500 (0:00:00.157) 0:19:30.785 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
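The mount-state check that just passed pairs the device path computed above (/dev/mapper/foo-test1) with the expected mount point (/opt/test1) and requires exactly one such mount to exist. Expressed against ansible_facts.mounts, the assertion could look like this (a sketch of the intent, reusing the two facts set in this block; not necessarily the test's literal wording):

- name: Verify the current mount state by device (illustrative)
  assert:
    that:
      - ansible_facts.mounts | selectattr('device', 'equalto', storage_test_device_path) | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point) | list | length == 1
    msg: expected exactly one mount of /dev/mapper/foo-test1 on /opt/test1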
was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 30 January 2026 18:55:51 -0500 (0:00:00.174) 0:19:30.959 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 30 January 2026 18:55:51 -0500 (0:00:00.199) 0:19:31.158 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 30 January 2026 18:55:52 -0500 (0:00:00.317) 0:19:31.476 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 30 January 2026 18:55:52 -0500 (0:00:00.567) 0:19:32.044 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 30 January 2026 18:55:53 -0500 (0:00:00.261) 0:19:32.305 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 30 January 2026 18:55:53 -0500 (0:00:00.308) 0:19:32.614 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 30 January 2026 18:55:53 -0500 (0:00:00.186) 0:19:32.801 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 30 January 2026 18:55:53 -0500 (0:00:00.319) 0:19:33.121 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, 
"storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 30 January 2026 18:55:53 -0500 (0:00:00.137) 0:19:33.258 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 30 January 2026 18:55:54 -0500 (0:00:00.325) 0:19:33.583 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 30 January 2026 18:55:54 -0500 (0:00:00.351) 0:19:33.934 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817301.6521516, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817301.6521516, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 252262, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769817301.6521516, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:55:55 -0500 (0:00:01.285) 0:19:35.220 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:55:56 -0500 (0:00:00.228) 0:19:35.448 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:55:56 -0500 (0:00:00.205) 0:19:35.654 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:55:56 -0500 (0:00:00.247) 0:19:35.902 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:55:56 -0500 (0:00:00.212) 0:19:36.114 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:55:57 -0500 (0:00:00.220) 0:19:36.335 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:55:57 -0500 (0:00:00.197) 0:19:36.533 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 30 January 2026 18:55:57 -0500 (0:00:00.240) 0:19:36.773 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 30 January 2026 18:56:01 -0500 (0:00:03.968) 0:19:40.742 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 30 January 2026 18:56:01 -0500 (0:00:00.126) 0:19:40.868 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 30 January 2026 18:56:01 -0500 (0:00:00.140) 0:19:41.008 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 30 January 2026 18:56:02 -0500 (0:00:00.287) 0:19:41.296 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 30 January 2026 18:56:02 -0500 (0:00:00.166) 0:19:41.462 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 30 January 2026 18:56:02 -0500 (0:00:00.118) 0:19:41.581 ******** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 30 January 2026 18:56:02 -0500 (0:00:00.187) 0:19:41.768 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:56:02 -0500 (0:00:00.187) 0:19:41.956 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 30 January 2026 18:56:02 -0500 (0:00:00.223) 0:19:42.179 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 30 January 2026 18:56:03 -0500 (0:00:00.218) 0:19:42.398 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 30 January 2026 18:56:03 -0500 (0:00:00.216) 0:19:42.614 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 30 January 2026 18:56:03 -0500 (0:00:00.204) 0:19:42.819 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 30 January 2026 18:56:03 -0500 (0:00:00.223) 0:19:43.043 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 30 January 2026 18:56:03 -0500 (0:00:00.131) 0:19:43.174 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 30 January 2026 18:56:04 -0500 
(0:00:00.222) 0:19:43.396 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 30 January 2026 18:56:04 -0500 (0:00:00.240) 0:19:43.637 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 30 January 2026 18:56:04 -0500 (0:00:00.183) 0:19:43.820 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 30 January 2026 18:56:04 -0500 (0:00:00.261) 0:19:44.082 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 30 January 2026 18:56:05 -0500 (0:00:00.838) 0:19:44.920 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 30 January 2026 18:56:05 -0500 (0:00:00.122) 0:19:45.042 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 30 January 2026 18:56:05 -0500 (0:00:00.214) 0:19:45.257 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 30 January 2026 18:56:06 -0500 (0:00:00.217) 0:19:45.474 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 30 January 2026 18:56:06 -0500 (0:00:00.225) 0:19:45.699 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 30 January 2026 18:56:06 -0500 (0:00:00.276) 0:19:45.976 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] 
************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:56:06 -0500 (0:00:00.152) 0:19:46.129 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:56:08 -0500 (0:00:01.189) 0:19:47.318 ******** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:56:09 -0500 (0:00:01.065) 0:19:48.384 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:56:09 -0500 (0:00:00.315) 0:19:48.700 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:56:09 -0500 (0:00:00.145) 0:19:48.846 ******** ok: [managed-node2] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:56:10 -0500 (0:00:01.369) 0:19:50.216 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:56:11 -0500 (0:00:00.170) 0:19:50.387 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:56:11 -0500 (0:00:00.142) 0:19:50.529 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:56:11 -0500 (0:00:00.270) 0:19:50.799 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:56:11 -0500 (0:00:00.269) 0:19:51.069 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
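The actual and requested sizes above agree because "4g" is parsed as 4 GiB: 4 * 1024^3 = 4294967296 bytes, exactly the value reported for the LV. The test appears to use its own size-parsing helper for this, but the same conversion exists as a stock Jinja2 filter, so an equivalent spot check might read (storage_test_actual_size is the fact shown a few tasks below):

- name: Assert expected size is actual size (illustrative)
  assert:
    that:
      - "('4g' | human_to_bytes) == 4294967296"
      - storage_test_actual_size.bytes == ('4g' | human_to_bytes)
    msg: actual LV size differs from the requested 4 GiB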
TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:56:12 -0500 (0:00:00.239) 0:19:51.308 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:56:12 -0500 (0:00:00.181) 0:19:51.490 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:56:12 -0500 (0:00:00.242) 0:19:51.732 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:56:12 -0500 (0:00:00.219) 0:19:51.951 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:56:12 -0500 (0:00:00.131) 0:19:52.083 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:56:13 -0500 (0:00:00.218) 0:19:52.301 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:56:13 -0500 (0:00:00.298) 0:19:52.599 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:56:13 -0500 (0:00:00.134) 0:19:52.734 ******** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:56:13 -0500 (0:00:00.195) 0:19:52.930 ******** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:56:13 -0500 (0:00:00.161) 0:19:53.091 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the
expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:56:13 -0500 (0:00:00.174) 0:19:53.265 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:56:14 -0500 (0:00:00.198) 0:19:53.464 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:56:14 -0500 (0:00:00.193) 0:19:53.657 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:56:14 -0500 (0:00:00.234) 0:19:53.892 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:56:14 -0500 (0:00:00.174) 0:19:54.066 ******** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:56:14 -0500 (0:00:00.145) 0:19:54.212 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:56:15 -0500 (0:00:00.113) 0:19:54.325 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:56:15 -0500 (0:00:00.242) 0:19:54.568 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.025307", "end": "2026-01-30 18:56:16.550324", "rc": 0, "start": "2026-01-30 18:56:16.525017" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
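With --noheadings, --nameprefixes and --unquoted, lvs prints one KEY=value record per LV, which is straightforward to dissect with a regex. A sketch of how the LVM2_SEGTYPE field in the output above can be extracted into the storage_test_lv_segtype list that the next two tasks set and assert on (the lvs arguments are verbatim from the log; the lvs_out register name is illustrative):

- name: Get information about the LV (command as logged above)
  command: >-
    lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
    -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_out
  changed_when: false

- name: Set LV segment type (illustrative parsing)
  set_fact:
    storage_test_lv_segtype: "{{ lvs_out.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"

- name: Check segment type (matches the assertion that follows)
  assert:
    that:
      - storage_test_lv_segtype[0] == 'linear'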
TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:56:16 -0500 (0:00:01.411) 0:19:55.980 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:56:16 -0500 (0:00:00.223) 0:19:56.204 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:56:17 -0500 (0:00:00.168) 0:19:56.373 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:56:17 -0500 (0:00:00.163) 0:19:56.536 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:56:17 -0500 (0:00:00.167) 0:19:56.704 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:56:17 -0500 (0:00:00.180) 0:19:56.884 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:56:17 -0500 (0:00:00.282) 0:19:57.167 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:56:18 -0500 (0:00:00.168) 0:19:57.335 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:56:18 -0500 (0:00:00.109) 0:19:57.445 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Friday 30 January 2026 18:56:18 -0500 (0:00:00.238) 0:19:57.684 ******** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0,
"state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:483 Friday 30 January 2026 18:56:20 -0500 (0:00:01.642) 0:19:59.327 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Friday 30 January 2026 18:56:20 -0500 (0:00:00.346) 0:19:59.673 ******** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Friday 30 January 2026 18:56:20 -0500 (0:00:00.315) 0:19:59.989 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:56:21 -0500 (0:00:00.355) 0:20:00.345 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:56:21 -0500 (0:00:00.294) 0:20:00.639 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:56:21 -0500 (0:00:00.231) 0:20:00.871 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:56:22 -0500 (0:00:00.435) 0:20:01.306 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:56:22 -0500 (0:00:00.200) 0:20:01.507 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:56:22 -0500 (0:00:00.303) 0:20:01.811 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:56:22 -0500 (0:00:00.194) 0:20:02.005 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:56:22 -0500 (0:00:00.195) 0:20:02.200 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:56:23 -0500 (0:00:00.431) 0:20:02.631 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:56:27 -0500 (0:00:04.326) 0:20:06.958 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:56:27 -0500 (0:00:00.216) 0:20:07.174 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:56:28 -0500 (0:00:00.194) 
0:20:07.369 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:56:33 -0500 (0:00:05.135) 0:20:12.504 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:56:33 -0500 (0:00:00.227) 0:20:12.731 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:56:33 -0500 (0:00:00.101) 0:20:12.832 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:56:33 -0500 (0:00:00.210) 0:20:13.043 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:56:33 -0500 (0:00:00.166) 0:20:13.210 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:56:38 -0500 (0:00:04.573) 0:20:17.783 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": 
"dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": 
"lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" 
}, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service": { "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service": { "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": 
"systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:56:41 -0500 (0:00:03.024) 0:20:20.808 ******** ok: [managed-node2] 
=> { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:56:41 -0500 (0:00:00.299) 0:20:21.108 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dc3cb5b2a\x2ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach 
luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", 
"Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Fri 2026-01-30 18:52:26 EST", "StateChangeTimestampMonotonic": "2552175322", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", 
"FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", 
"StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:56:45 -0500 (0:00:03.462) 0:20:24.571 ******** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Friday 30 January 2026 18:56:51 -0500 (0:00:06.043) 0:20:30.615 ******** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 
'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:56:51 -0500 (0:00:00.302) 0:20:30.917 ******** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2dc3cb5b2a\x2ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc3cb5b2a\\x2ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", 
"RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node2] => (item=systemd-cryptsetup@luk...ddd54\x2d432f\x2db831\x2dc342bb741a6d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "name": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": 
"[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14004", "LimitNPROCSoft": "14004", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14004", "LimitSIGPENDINGSoft": "14004", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...ddd54\\x2d432f\\x2db831\\x2dc342bb741a6d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": 
"[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22406", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Friday 30 January 2026 18:56:54 -0500 (0:00:03.269) 0:20:34.187 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Friday 30 January 2026 18:56:55 -0500 (0:00:00.276) 0:20:34.463 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Friday 30 January 2026 18:56:55 -0500 (0:00:00.244) 0:20:34.708 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Friday 30 January 2026 18:56:55 -0500 (0:00:00.181) 0:20:34.889 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817379.8039534, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769817379.8039534, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1769817379.8039534, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "90005642", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Friday 30 January 2026 18:56:56 -0500 (0:00:01.377) 0:20:36.267 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:507 Friday 30 January 2026 18:56:57 -0500 (0:00:00.308) 0:20:36.576 ******** TASK [fedora.linux_system_roles.storage : Set 
platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Friday 30 January 2026 18:56:58 -0500 (0:00:00.976) 0:20:37.552 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Friday 30 January 2026 18:56:58 -0500 (0:00:00.377) 0:20:37.930 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Friday 30 January 2026 18:56:58 -0500 (0:00:00.205) 0:20:38.135 ******** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Friday 30 January 2026 18:56:59 -0500 (0:00:00.668) 0:20:38.804 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Friday 30 January 2026 18:56:59 -0500 (0:00:00.227) 0:20:39.031 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Friday 30 January 2026 18:56:59 -0500 (0:00:00.182) 0:20:39.213 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
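The doubled CentOS_8.yml item above is expected rather than a paste error: set_vars.yml loops over a list of increasingly specific vars file names and includes every one that exists on disk, and on this host both the major-version and the full-version names evidently resolve to CentOS_8.yml, so the same file is loaded twice (the second include is a harmless no-op, since it sets the same facts). A minimal sketch of that lookup pattern, assuming the conventional linux_system_roles layout; the role's actual set_vars.yml is the authoritative source:

    - name: Set platform/version specific variables
      include_vars: "{{ __vars_file }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"        # RedHat.yml - skipped in this run
        - "{{ ansible_facts['distribution'] }}.yml"     # CentOS.yml - skipped in this run
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file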
}, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Friday 30 January 2026 18:57:00 -0500 (0:00:00.160) 0:20:39.374 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Friday 30 January 2026 18:57:00 -0500 (0:00:00.153) 0:20:39.528 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Friday 30 January 2026 18:57:00 -0500 (0:00:00.446) 0:20:39.975 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Friday 30 January 2026 18:57:05 -0500 (0:00:04.358) 0:20:44.333 ******** ok: [managed-node2] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Friday 30 January 2026 18:57:05 -0500 (0:00:00.285) 0:20:44.619 ******** ok: [managed-node2] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Friday 30 January 2026 18:57:05 -0500 (0:00:00.263) 0:20:44.882 ******** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Friday 30 January 2026 18:57:10 -0500 (0:00:05.082) 0:20:49.965 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Friday 30 January 2026 18:57:11 -0500 (0:00:00.445) 0:20:50.410 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Friday 30 January 2026 18:57:11 -0500 (0:00:00.135) 0:20:50.546 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Friday 30 January 2026 18:57:11 -0500 (0:00:00.188) 0:20:50.735 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Friday 30 January 2026 18:57:11 -0500 (0:00:00.143) 0:20:50.879 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Friday 30 January 2026 18:57:15 -0500 (0:00:04.181) 0:20:55.061 ******** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": 
"mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": 
"rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": 
"enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { 
"name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:57:18 -0500 (0:00:02.317) 0:20:57.378 ******** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Friday 30 January 2026 18:57:18 -0500 (0:00:00.304) 0:20:57.682 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Friday 30 January 2026 18:57:18 -0500 (0:00:00.186) 0:20:57.869 ******** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "state": "mounted" } ], "packages": [ 
"cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Friday 30 January 2026 18:57:32 -0500 (0:00:13.808) 0:21:11.678 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Friday 30 January 2026 18:57:32 -0500 (0:00:00.150) 0:21:11.828 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817313.4041219, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1769817313.4011219, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769817313.4011219, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:57:33 -0500 
(0:00:01.100) 0:21:12.929 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:57:35 -0500 (0:00:01.381) 0:21:14.310 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:57:35 -0500 (0:00:00.149) 0:21:14.460 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:57:35 -0500 (0:00:00.239) 0:21:14.699 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:57:35 -0500 (0:00:00.221) 0:21:14.921 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:57:35 -0500 (0:00:00.184) 0:21:15.105 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:57:37 -0500 (0:00:01.585) 0:21:16.691 ******** ok: [managed-node2] => { "changed": 
false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:57:38 -0500 (0:00:01.492) 0:21:18.183 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Friday 30 January 2026 18:57:40 -0500 (0:00:01.318) 0:21:19.502 ******** skipping: [managed-node2] => (item={'src': '/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Friday 30 January 2026 18:57:40 -0500 (0:00:00.392) 0:21:19.894 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Friday 30 January 2026 18:57:42 -0500 (0:00:01.497) 0:21:21.391 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817325.218092, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1769817318.0491102, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 127926477, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1769817318.04811, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1931922327", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Friday 30 January 2026 18:57:43 -0500 (0:00:01.307) 0:21:22.699 ******** changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Friday 30 January 2026 18:57:44 -0500 (0:00:01.342) 0:21:24.041 ******** ok: [managed-node2] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:524 Friday 30 January 2026 18:57:46 -0500 (0:00:01.537) 0:21:25.579 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Friday 30 January 2026 18:57:46 -0500 (0:00:00.092) 0:21:25.671 ******** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:57:46 
-0500 (0:00:00.555) 0:21:26.227 ******** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:57:47 -0500 (0:00:00.277) 0:21:26.504 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "454a54a5-fe61-4dba-a3b5-b60deeb2f467" }, "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "size": "4G", "type": "crypt", "uuid": "c6d76669-748e-4e3f-945c-4ecd2ed706e6" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:57:48 -0500 (0:00:01.496) 0:21:28.000 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003351", "end": "2026-01-30 18:57:49.575260", "rc": 0, "start": "2026-01-30 18:57:49.571909" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Friday 30 January 2026 18:57:48 -0500 (0:00:01.496) 0:21:28.000 ********
ok: [managed-node2] => {"changed": false, "cmd": ["cat", "/etc/fstab"], "delta": "0:00:00.003351", "end": "2026-01-30 18:57:49.575260", "rc": 0, "start": "2026-01-30 18:57:49.571909"}

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Wed May 29 07:43:06 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467 /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Friday 30 January 2026 18:57:49 -0500 (0:00:01.153) 0:21:29.154 ********
ok: [managed-node2] => {"changed": false, "cmd": ["cat", "/etc/crypttab"], "delta": "0:00:00.002552", "end": "2026-01-30 18:57:50.961057", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:57:50.958505"}

STDOUT:

luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467 /dev/mapper/foo-test1 -
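Aside (not part of the captured run): the two files cooperate. /etc/crypttab maps the name luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467 to its backing device /dev/mapper/foo-test1, with "-" in the key-file field meaning no key file is stored (the passphrase is supplied at unlock time), while /etc/fstab mounts the opened mapping /dev/mapper/luks-<UUID> at /opt/test1. A minimal sketch of that pairing check as Ansible tasks, assuming only the crypttab(5) "<name> <device> <key-file>" layout shown above:

    # Sketch only: read /etc/crypttab and confirm the expected mapping line.
    - name: Read /etc/crypttab
      ansible.builtin.command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Assert the LUKS name is paired with its backing device
      ansible.builtin.assert:
        that:
          # exactly one line of the form "<luks name> <backing device> -"
          - crypttab.stdout_lines | select('match', '^luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467 /dev/mapper/foo-test1 -$') | list | length == 1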
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Friday 30 January 2026 18:57:51 -0500 (0:00:01.440) 0:21:30.595 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2

TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Friday 30 January 2026 18:57:51 -0500 (0:00:00.305) 0:21:30.901 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_pool_tests": ["members", "volumes"]}, "changed": false}

TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Friday 30 January 2026 18:57:51 -0500 (0:00:00.237) 0:21:31.138 ********
ok: [managed-node2] => {"changed": false, "cmd": ["vgs", "--noheadings", "--binary", "-o", "shared", "foo"], "delta": "0:00:00.022879", "end": "2026-01-30 18:57:53.242173", "rc": 0, "start": "2026-01-30 18:57:53.219294"}

STDOUT:

0

TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Friday 30 January 2026 18:57:53 -0500 (0:00:01.556) 0:21:32.694 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Friday 30 January 2026 18:57:53 -0500 (0:00:00.242) 0:21:32.937 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Friday 30 January 2026 18:57:53 -0500 (0:00:00.339) 0:21:33.277 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": ["/dev/sda"]}, "changed": false}

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Friday 30 January 2026 18:57:54 -0500 (0:00:00.253) 0:21:33.530 ********
ok: [managed-node2] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda"}

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Friday 30 January 2026 18:57:55 -0500 (0:00:01.519) 0:21:35.049 ********
ok: [managed-node2] => {"ansible_facts": {"__pvs_lvm_len": "1"}, "changed": false}

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Friday 30 January 2026 18:57:55 -0500 (0:00:00.213) 0:21:35.262 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_pool_pvs": ["/dev/sda"]}, "changed": false}

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Friday 30 January 2026 18:57:56 -0500 (0:00:00.217) 0:21:35.479 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Friday 30 January 2026 18:57:56 -0500 (0:00:00.169) 0:21:35.649 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [Set expected pv type - 2] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Friday 30 January 2026 18:57:56 -0500 (0:00:00.215) 0:21:35.865 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_expected_pv_type": "disk"}, "changed": false}

TASK [Set expected pv type - 3] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Friday 30 January 2026 18:57:56 -0500 (0:00:00.172) 0:21:36.037 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55
Friday 30 January 2026 18:57:56 -0500 (0:00:00.142) 0:21:36.180 ********
ok: [managed-node2] => (item=/dev/sda) => {"ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda"}

MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68
Friday 30 January 2026 18:57:57 -0500 (0:00:00.210) 0:21:36.390 ********
ok: [managed-node2] => {"changed": false, "failed_when_result": false, "rc": 1}

STDERR:

Shared connection to 10.31.47.227 closed.

MSG: non-zero return code

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78
Friday 30 January 2026 18:57:58 -0500 (0:00:01.107) 0:21:37.498 ********
skipping: [managed-node2] => (item=/dev/sda) => {"ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda"}
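Aside (not part of the captured run): the grow-to-fill task above is a feature probe, not an assertion. The probe command exits 1 on this blivet build, failed_when_result is forced to false so the play continues, and the dependent "PVs fill the whole devices" check is then skipped. A minimal sketch of that probe-then-gate pattern (the probe command here is hypothetical):

    # Sketch only: probe for an optional feature, never fail, gate later work on rc.
    - name: Probe whether the installed blivet supports the feature
      ansible.builtin.command: python3 -c "import blivet"  # hypothetical probe
      register: probe
      failed_when: false     # rc != 0 means "unsupported", not an error
      changed_when: false

    - name: Run the dependent verification only when supported
      ansible.builtin.debug:
        msg: "feature available, running the extra check"
      when: probe.rc == 0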
TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88
Friday 30 January 2026 18:57:58 -0500 (0:00:00.242) 0:21:37.740 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Friday 30 January 2026 18:57:58 -0500 (0:00:00.217) 0:21:37.958 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Friday 30 January 2026 18:57:58 -0500 (0:00:00.074) 0:21:38.033 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Friday 30 January 2026 18:57:58 -0500 (0:00:00.042) 0:21:38.075 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Friday 30 January 2026 18:57:58 -0500 (0:00:00.041) 0:21:38.117 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Friday 30 January 2026 18:57:58 -0500 (0:00:00.067) 0:21:38.184 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Friday 30 January 2026 18:57:59 -0500 (0:00:00.120) 0:21:38.305 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Friday 30 January 2026 18:57:59 -0500 (0:00:00.050) 0:21:38.355 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Friday 30 January 2026 18:57:59 -0500 (0:00:00.098) 0:21:38.454 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Friday 30 January 2026 18:57:59 -0500 (0:00:00.236) 0:21:38.690 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Friday 30 January 2026 18:57:59 -0500 (0:00:00.047) 0:21:38.738 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Friday 30 January 2026 18:57:59 -0500 (0:00:00.041) 0:21:38.780 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null}, "changed": false}
TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91
Friday 30 January 2026 18:57:59 -0500 (0:00:00.075) 0:21:38.856 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Friday 30 January 2026 18:57:59 -0500 (0:00:00.206) 0:21:39.062 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Friday 30 January 2026 18:57:59 -0500 (0:00:00.167) 0:21:39.229 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Friday 30 January 2026 18:58:00 -0500 (0:00:00.133) 0:21:39.362 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check segment type] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Friday 30 January 2026 18:58:00 -0500 (0:00:00.091) 0:21:39.453 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Friday 30 January 2026 18:58:00 -0500 (0:00:00.134) 0:21:39.588 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Friday 30 January 2026 18:58:00 -0500 (0:00:00.054) 0:21:39.643 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Friday 30 January 2026 18:58:00 -0500 (0:00:00.114) 0:21:39.758 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Friday 30 January 2026 18:58:00 -0500 (0:00:00.115) 0:21:39.873 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94
Friday 30 January 2026 18:58:00 -0500 (0:00:00.100) 0:21:39.974 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Friday 30 January 2026 18:58:00 -0500 (0:00:00.182) 0:21:40.156 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Friday 30 January 2026 18:58:00 -0500 (0:00:00.120) 0:21:40.276 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Friday 30 January 2026 18:58:01 -0500 (0:00:00.081) 0:21:40.357 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Friday 30 January 2026 18:58:01 -0500 (0:00:00.111) 0:21:40.468 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Friday 30 January 2026 18:58:01 -0500 (0:00:00.129) 0:21:40.598 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Friday 30 January 2026 18:58:01 -0500 (0:00:00.125) 0:21:40.724 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Friday 30 January 2026 18:58:01 -0500 (0:00:00.318) 0:21:41.043 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Friday 30 January 2026 18:58:01 -0500 (0:00:00.048) 0:21:41.091 ******** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Friday 30 January 2026 18:58:02 -0500 (0:00:00.210) 0:21:41.301 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Friday 30 January 2026 18:58:02 -0500 (0:00:00.275) 0:21:41.577 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Friday 30 January 2026 18:58:02 -0500 (0:00:00.178) 0:21:41.755 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Friday 30 January 2026 
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Friday 30 January 2026 18:58:02 -0500 (0:00:00.275) 0:21:42.273 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Friday 30 January 2026 18:58:03 -0500 (0:00:00.307) 0:21:42.580 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Friday 30 January 2026 18:58:03 -0500 (0:00:00.109) 0:21:42.690 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": null}, "changed": false}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Friday 30 January 2026 18:58:03 -0500 (0:00:00.217) 0:21:42.908 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null}, "changed": false}

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Friday 30 January 2026 18:58:03 -0500 (0:00:00.111) 0:21:43.020 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Friday 30 January 2026 18:58:03 -0500 (0:00:00.268) 0:21:43.289 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Friday 30 January 2026 18:58:04 -0500 (0:00:00.271) 0:21:43.560 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Friday 30 January 2026 18:58:04 -0500 (0:00:00.208) 0:21:43.768 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Friday 30 January 2026 18:58:04 -0500 (0:00:00.243) 0:21:44.011 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Friday 30 January 2026 18:58:04 -0500 (0:00:00.184) 0:21:44.195 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Friday 30 January 2026 18:58:05 -0500 (0:00:00.155) 0:21:44.351 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Friday 30 January 2026 18:58:05 -0500 (0:00:00.041) 0:21:44.393 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Friday 30 January 2026 18:58:05 -0500 (0:00:00.228) 0:21:44.621 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_vdo_status": null}, "changed": false}

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Friday 30 January 2026 18:58:05 -0500 (0:00:00.113) 0:21:44.735 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Friday 30 January 2026 18:58:05 -0500 (0:00:00.314) 0:21:45.049 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Print script output] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Friday 30 January 2026 18:58:05 -0500 (0:00:00.179) 0:21:45.229 ********
skipping: [managed-node2] => {}

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Friday 30 January 2026 18:58:06 -0500 (0:00:00.163) 0:21:45.393 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Friday 30 January 2026 18:58:06 -0500 (0:00:00.208) 0:21:45.601 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Friday 30 January 2026 18:58:06 -0500 (0:00:00.241) 0:21:45.842 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Friday 30 January 2026 18:58:07 -0500 (0:00:00.514) 0:21:46.357 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Friday 30 January 2026 18:58:07 -0500 (0:00:00.110) 0:21:46.467 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_stratis_report": null}, "changed": false}

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Friday 30 January 2026 18:58:07 -0500 (0:00:00.122) 0:21:46.589 ********
ok: [managed-node2] => {"ansible_facts": {"__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": []}, "changed": false}

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Friday 30 January 2026 18:58:07 -0500 (0:00:00.133) 0:21:46.723 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Friday 30 January 2026 18:58:07 -0500 (0:00:00.293) 0:21:47.016 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_volume_present": true, "_storage_volume_tests": ["mount", "fstab", "fs", "device", "encryption", "md", "size", "cache"]}, "changed": false}

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Friday 30 January 2026 18:58:07 -0500 (0:00:00.245) 0:21:47.261 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2
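Aside (not part of the captured run): the eight included files mirror the _storage_volume_tests list one-for-one, so the "Run test verify" task behaves like a loop that templates each subset name into a file name. A minimal sketch of that dispatch pattern, assuming the file-naming convention visible above:

    # Sketch only: one include per entry in the volume-test subset list.
    - name: Run test verify for storage_test_volume_subset
      ansible.builtin.include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"
      loop_control:
        loop_var: storage_test_volume_subset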
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Friday 30 January 2026 18:58:08 -0500 (0:00:00.632) 0:21:47.894 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_device_path": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467"}, "changed": false}

TASK [Set some facts] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Friday 30 January 2026 18:58:08 -0500 (0:00:00.165) 0:21:48.060 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0"}, "changed": false}

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Friday 30 January 2026 18:58:09 -0500 (0:00:00.276) 0:21:48.336 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Friday 30 January 2026 18:58:09 -0500 (0:00:00.242) 0:21:48.579 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Friday 30 January 2026 18:58:09 -0500 (0:00:00.162) 0:21:48.741 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Friday 30 January 2026 18:58:09 -0500 (0:00:00.179) 0:21:48.921 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Friday 30 January 2026 18:58:09 -0500 (0:00:00.200) 0:21:49.121 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Friday 30 January 2026 18:58:09 -0500 (0:00:00.167) 0:21:49.288 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Friday 30 January 2026 18:58:10 -0500 (0:00:00.112) 0:21:49.401 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify swap status] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Friday 30 January 2026 18:58:10 -0500 (0:00:00.124) 0:21:49.525 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Unset facts] *************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Friday 30 January 2026 18:58:10 -0500 (0:00:00.087) 0:21:49.612 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null}, "changed": false}

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Friday 30 January 2026 18:58:10 -0500 (0:00:00.108) 0:21:49.721 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": ["/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467 "], "storage_test_fstab_mount_options_matches": [" /opt/test1 xfs defaults "], "storage_test_fstab_mount_point_matches": [" /opt/test1 "]}, "changed": false}

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Friday 30 January 2026 18:58:10 -0500 (0:00:00.298) 0:21:50.020 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Friday 30 January 2026 18:58:10 -0500 (0:00:00.223) 0:21:50.244 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Friday 30 January 2026 18:58:11 -0500 (0:00:00.212) 0:21:50.456 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Friday 30 January 2026 18:58:11 -0500 (0:00:00.075) 0:21:50.532 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Friday 30 January 2026 18:58:11 -0500 (0:00:00.189) 0:21:50.722 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null}, "changed": false}
TASK [Verify fs type] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Friday 30 January 2026 18:58:11 -0500 (0:00:00.126) 0:21:50.848 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Friday 30 January 2026 18:58:11 -0500 (0:00:00.249) 0:21:51.098 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Friday 30 January 2026 18:58:12 -0500 (0:00:00.287) 0:21:51.385 ********
ok: [managed-node2] => {"changed": false, "stat": {"atime": 1769817452.0017705, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817452.0017705, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 252262, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769817452.0017705, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Friday 30 January 2026 18:58:13 -0500 (0:00:01.100) 0:21:52.486 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Friday 30 January 2026 18:58:13 -0500 (0:00:00.238) 0:21:52.724 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Friday 30 January 2026 18:58:13 -0500 (0:00:00.253) 0:21:52.978 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed
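Aside (not part of the captured run): the stat payload above is what drives the device-node assertions: "exists": true and "isblk": true together confirm that /dev/mapper/foo-test1 is present and is a block device. A minimal sketch of the same check, using only documented ansible.builtin.stat return fields:

    # Sketch only: stat the mapper node and assert it is an existing block device.
    - name: See whether the device node is present
      ansible.builtin.stat:
        path: /dev/mapper/foo-test1
        follow: true          # resolve the /dev/mapper symlink to the dm node
      register: dev_node

    - name: Verify the presence/absence of the device node
      ansible.builtin.assert:
        that:
          - dev_node.stat.exists
          - dev_node.stat.isblk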
TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Friday 30 January 2026 18:58:13 -0500 (0:00:00.210) 0:21:53.189 ********
ok: [managed-node2] => {"ansible_facts": {"st_volume_type": "lvm"}, "changed": false}

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Friday 30 January 2026 18:58:14 -0500 (0:00:00.235) 0:21:53.424 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Friday 30 January 2026 18:58:14 -0500 (0:00:00.189) 0:21:53.613 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Friday 30 January 2026 18:58:14 -0500 (0:00:00.272) 0:21:53.886 ********
ok: [managed-node2] => {"changed": false, "stat": {"atime": 1769817452.15277, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817452.15277, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 268141, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1769817452.15277, "nlink": 1, "path": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false}}

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Friday 30 January 2026 18:58:16 -0500 (0:00:01.422) 0:21:55.308 ********
ok: [managed-node2] => {"changed": false, "rc": 0, "results": []}

MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Friday 30 January 2026 18:58:20 -0500 (0:00:04.113) 0:21:59.422 ********
ok: [managed-node2] => {"changed": false, "cmd": ["cryptsetup", "luksDump", "/dev/mapper/foo-test1"], "delta": "0:00:00.010356", "end": "2026-01-30 18:58:21.040985", "rc": 0, "start": "2026-01-30 18:58:21.030629"}

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           454a54a5-fe61-4dba-a3b5-b60deeb2f467
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2i
        Time cost:  4
        Memory:     909207
        Threads:    2
        Salt:       c4 4a d1 da e5 4c 8b bf 5a bc 72 06 ba dd f4 2f
                    d2 50 59 04 ae bb 9a ad 5b a9 49 8a b6 d7 13 a9
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 120470
        Salt:       0f 78 cd 78 81 02 ad 97 b3 52 46 0f b3 5f f9 2a
                    ab f5 02 af e8 97 85 7b a3 7a 78 98 1f b3 35 b4
        Digest:     e5 c5 1f e9 85 95 af 59 8a 26 58 68 48 9f 4e 91
                    3b 78 7a 66 8f 52 b2 5e 4a 34 d6 ff cc ea fa 62
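Aside (not part of the captured run): the "Check LUKS version" assertion that follows compares the dump's Version: field against the requested encryption_luks_version ("luks2" means version 2). A minimal sketch of collecting and checking it (variable names hypothetical):

    # Sketch only: dump the LUKS header and assert it is LUKS2.
    - name: Collect LUKS info for this volume
      ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Check LUKS version
      ansible.builtin.assert:
        that:
          - luks_dump.stdout is search('Version:\s+2')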
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Friday 30 January 2026 18:58:21 -0500 (0:00:01.143) 0:22:00.565 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Friday 30 January 2026 18:58:21 -0500 (0:00:00.196) 0:22:00.762 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Friday 30 January 2026 18:58:21 -0500 (0:00:00.302) 0:22:01.065 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Friday 30 January 2026 18:58:22 -0500 (0:00:00.236) 0:22:01.302 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Friday 30 January 2026 18:58:22 -0500 (0:00:00.278) 0:22:01.580 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Friday 30 January 2026 18:58:22 -0500 (0:00:00.421) 0:22:02.002 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Friday 30 January 2026 18:58:22 -0500 (0:00:00.252) 0:22:02.255 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set test variables] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Friday 30 January 2026 18:58:23 -0500 (0:00:00.346) 0:22:02.601 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": ["luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467 /dev/mapper/foo-test1 -"], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-"}, "changed": false}

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Friday 30 January 2026 18:58:23 -0500 (0:00:00.438) 0:22:03.040 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Friday 30 January 2026 18:58:23 -0500 (0:00:00.207) 0:22:03.247 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Friday 30 January 2026 18:58:24 -0500 (0:00:00.121) 0:22:03.369 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Friday 30 January 2026 18:58:24 -0500 (0:00:00.292) 0:22:03.661 ********
ok: [managed-node2] => {"changed": false}

MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Friday 30 January 2026 18:58:24 -0500 (0:00:00.266) 0:22:03.928 ********
ok: [managed-node2] => {"ansible_facts": {"_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null}, "changed": false}

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Friday 30 January 2026 18:58:24 -0500 (0:00:00.205) 0:22:04.134 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Friday 30 January 2026 18:58:25 -0500 (0:00:00.223) 0:22:04.358 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Friday 30 January 2026 18:58:25 -0500 (0:00:00.205) 0:22:04.563 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Friday 30 January 2026 18:58:25 -0500 (0:00:00.085) 0:22:04.649 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Friday 30 January 2026 18:58:25 -0500 (0:00:00.117) 0:22:04.766 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Friday 30 January 2026 18:58:25 -0500 (0:00:00.348) 0:22:05.115 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Friday 30 January 2026 18:58:26 -0500 (0:00:00.284) 0:22:05.399 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Friday 30 January 2026 18:58:26 -0500 (0:00:00.173) 0:22:05.573 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Friday 30 January 2026 18:58:26 -0500 (0:00:00.122) 0:22:05.695 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Friday 30 January 2026 18:58:26 -0500 (0:00:00.194) 0:22:05.890 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Friday 30 January 2026 18:58:26 -0500 (0:00:00.273) 0:22:06.164 ********
ok: [managed-node2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Friday 30 January 2026 18:58:27 -0500 (0:00:01.110) 0:22:07.274 ********
ok: [managed-node2] => {"bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB"}

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Friday 30 January 2026 18:58:29 -0500 (0:00:01.246) 0:22:08.521 ********
ok: [managed-node2] => {"ansible_facts": {"storage_test_expected_size": "4294967296"}, "changed": false}

TASK [Show expected size] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Friday 30 January 2026 18:58:29 -0500 (0:00:00.431) 0:22:08.952 ********
ok: [managed-node2] => {"storage_test_expected_size": "4294967296"}
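Aside (not part of the captured run): the requested size "4g" and the measured size agree because both normalize to bytes: 4 GiB = 4 x 1024^3 = 4294967296. A minimal sketch of the same normalization with the human_to_bytes filter (fact names taken from the output above):

    # Sketch only: convert "4g" to bytes, then compare against the parsed size.
    - name: Establish base value for expected size
      ansible.builtin.set_fact:
        storage_test_expected_size: "{{ '4g' | ansible.builtin.human_to_bytes }}"  # 4294967296

    - name: Assert expected size is actual size
      ansible.builtin.assert:
        that:
          - storage_test_actual_size.bytes | int == storage_test_expected_size | int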
TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Friday 30 January 2026 18:58:29 -0500 (0:00:00.283) 0:22:09.236 ********
ok: [managed-node2] => {"bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB"}

TASK [Show test pool] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Friday 30 January 2026 18:58:31 -0500 (0:00:01.508) 0:22:10.744 ********
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Friday 30 January 2026 18:58:31 -0500 (0:00:00.229) 0:22:10.973 ********
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Friday 30 January 2026 18:58:31 -0500 (0:00:00.162) 0:22:11.136 ********
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Friday 30 January 2026 18:58:32 -0500 (0:00:00.208) 0:22:11.344 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Friday 30 January 2026 18:58:32 -0500 (0:00:00.220) 0:22:11.565 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Friday 30 January 2026 18:58:32 -0500 (0:00:00.254) 0:22:11.820 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Friday 30 January 2026 18:58:32 -0500 (0:00:00.356) 0:22:12.176 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Friday 30 January 2026 18:58:33 -0500 (0:00:00.215) 0:22:12.392 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Friday 30 January 2026 18:58:33 -0500 (0:00:00.181) 0:22:12.573 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Friday 30 January 2026 18:58:33 -0500 (0:00:00.181) 0:22:12.755 ********
skipping: [managed-node2] => {"changed": false, "skip_reason": "Conditional result was False"}
max usable thin pool space] ********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Friday 30 January 2026 18:58:33 -0500 (0:00:00.181) 0:22:12.755 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Friday 30 January 2026 18:58:33 -0500 (0:00:00.233) 0:22:12.988 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Friday 30 January 2026 18:58:33 -0500 (0:00:00.211) 0:22:13.199 ********
skipping: [managed-node2] => {}
TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Friday 30 January 2026 18:58:34 -0500 (0:00:00.157) 0:22:13.357 ********
skipping: [managed-node2] => {}
TASK [Show test volume size] ***************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Friday 30 January 2026 18:58:34 -0500 (0:00:00.245) 0:22:13.603 ********
skipping: [managed-node2] => {}
TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Friday 30 January 2026 18:58:34 -0500 (0:00:00.175) 0:22:13.778 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Friday 30 January 2026 18:58:34 -0500 (0:00:00.267) 0:22:14.045 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Friday 30 January 2026 18:58:34 -0500 (0:00:00.162) 0:22:14.208 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Friday 30 January 2026 18:58:35 -0500 (0:00:00.181) 0:22:14.390 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Friday 30 January 2026 18:58:35 -0500 (0:00:00.245) 0:22:14.635 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
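The size tasks above and the comparison that follows implement a simple pattern: the requested size is normalized to a byte count, the actual size is read back from the device, and an assertion compares the two. A minimal sketch of that final check, assuming the storage_test_expected_size and storage_test_actual_size values seen in this log and an illustrative 2% tolerance (the role's actual threshold may differ):

- name: Assert expected size is actual size (sketch)
  ansible.builtin.assert:
    that:
      # Compare byte counts with a small relative tolerance; 2% is an
      # assumed value for illustration, not the test's exact threshold.
      - >-
        ((storage_test_expected_size | int) - storage_test_actual_size.bytes) | abs
        <= (storage_test_expected_size | int) * 0.02
    msg: >-
      Actual size {{ storage_test_actual_size.bytes }} does not match
      expected size {{ storage_test_expected_size }}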
TASK [Show actual size] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Friday 30 January 2026 18:58:35 -0500 (0:00:00.194) 0:22:14.830 ********
ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }
TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Friday 30 January 2026 18:58:35 -0500 (0:00:00.198) 0:22:15.028 ********
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }
TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Friday 30 January 2026 18:58:35 -0500 (0:00:00.203) 0:22:15.232 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Friday 30 January 2026 18:58:36 -0500 (0:00:00.251) 0:22:15.484 ********
ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024075", "end": "2026-01-30 18:58:37.426278", "rc": 0, "start": "2026-01-30 18:58:37.402203" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Friday 30 January 2026 18:58:37 -0500 (0:00:01.512) 0:22:16.997 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }
TASK [Check segment type] ******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Friday 30 January 2026 18:58:37 -0500 (0:00:00.226) 0:22:17.223 ********
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Friday 30 January 2026 18:58:38 -0500 (0:00:00.355) 0:22:17.578 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Friday 30 January 2026 18:58:38 -0500 (0:00:00.212) 0:22:17.791 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
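The LV check above relies on lvs --nameprefixes, which emits each field as a shell-style LVM2_KEY=value pair that is easy to pick apart in Jinja. A minimal sketch of that technique; the command line is the one captured above, while the register name lvs_out and the parsing expression are illustrative assumptions:

- name: Get information about the LV (command as captured in this log)
  ansible.builtin.command:
    cmd: >-
      lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
      -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_out        # hypothetical register name
  changed_when: false      # read-only query

- name: Set LV segment type (illustrative parsing of the KEY=value output)
  ansible.builtin.set_fact:
    # regex_search with a capture group returns a list, matching the
    # ["linear"] value recorded above.
    storage_test_lv_segtype: "{{ lvs_out.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"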
TASK [Set expected cache size] *************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Friday 30 January 2026 18:58:38 -0500 (0:00:00.128) 0:22:17.919 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Friday 30 January 2026 18:58:38 -0500 (0:00:00.177) 0:22:18.096 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Friday 30 January 2026 18:58:38 -0500 (0:00:00.181) 0:22:18.278 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Friday 30 January 2026 18:58:39 -0500 (0:00:00.286) 0:22:18.564 ********
TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Friday 30 January 2026 18:58:39 -0500 (0:00:00.143) 0:22:18.707 ********
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Clean up] ****************************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:527
Friday 30 January 2026 18:58:39 -0500 (0:00:00.262) 0:22:18.970 ********
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Friday 30 January 2026 18:58:40 -0500 (0:00:00.905) 0:22:19.876 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Friday 30 January 2026 18:58:40 -0500 (0:00:00.217) 0:22:20.093 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
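The next task tries vars files from generic to specific: RedHat.yml and CentOS.yml are skipped by their conditionals and CentOS_8.yml matches, as its results show. The same effect can be had with the first_found lookup; this is a condensed sketch of the pattern, not the role's actual task, which iterates a loop with per-item conditionals:

- name: Set platform/version specific variables (condensed sketch)
  ansible.builtin.include_vars: "{{ lookup('first_found', params) }}"
  vars:
    params:
      files:
        # Most specific first: CentOS_8.yml, then CentOS.yml, then RedHat.yml
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['os_family'] }}.yml"
      paths:
        - "{{ role_path }}/vars"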
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Friday 30 January 2026 18:58:41 -0500 (0:00:00.239) 0:22:20.333 ********
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
ok: [managed-node2] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Friday 30 January 2026 18:58:41 -0500 (0:00:00.382) 0:22:20.715 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Friday 30 January 2026 18:58:41 -0500 (0:00:00.243) 0:22:20.959 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Friday 30 January 2026 18:58:41 -0500 (0:00:00.258) 0:22:21.218 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Friday 30 January 2026 18:58:42 -0500 (0:00:00.133) 0:22:21.352 ********
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Friday 30 January 2026 18:58:42 -0500 (0:00:00.203) 0:22:21.556 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Friday 30 January 2026 18:58:42 -0500 (0:00:00.547) 0:22:22.103 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Friday 30 January 2026 18:58:46 -0500 (0:00:04.148) 0:22:26.252 ********
ok: [managed-node2] => { "storage_pools | d([])": [] }
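Everything from here to the end of the play is driven by a single role invocation: the Clean up task at tests_luks2.yml:527 calls the storage role with one disk volume marked absent, which the Show storage_volumes task below echoes verbatim. A minimal sketch of such an invocation, using exactly the volume shown in this run:

- name: Clean up (sketch of the invocation behind this part of the run)
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        state: absent    # tear down the device and its LUKS layer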
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Friday 30 January 2026 18:58:47 -0500 (0:00:00.415) 0:22:26.668 ********
ok: [managed-node2] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }
TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Friday 30 January 2026 18:58:47 -0500 (0:00:00.233) 0:22:26.901 ********
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Friday 30 January 2026 18:58:53 -0500 (0:00:05.522) 0:22:32.424 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Friday 30 January 2026 18:58:53 -0500 (0:00:00.394) 0:22:32.818 ********
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Friday 30 January 2026 18:58:53 -0500 (0:00:00.176) 0:22:32.995 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Friday 30 January 2026 18:58:54 -0500 (0:00:00.384) 0:22:33.379 ********
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Friday 30 January 2026 18:58:54 -0500 (0:00:00.200) 0:22:33.580 ********
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52
Friday 30 January 2026 18:58:58 -0500 (0:00:04.656) 0:22:38.237 ********
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" },
"blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { 
"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": 
"systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": 
"systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Friday 30 January 2026 18:59:01 -0500 (0:00:03.044) 
TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] *****
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58
Friday 30 January 2026 18:59:01 -0500 (0:00:03.044) 0:22:41.281 ********
ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64
Friday 30 January 2026 18:59:02 -0500 (0:00:00.381) 0:22:41.663 ********
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Friday 30 January 2026 18:59:02 -0500 (0:00:00.175) 0:22:41.839 ********
changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85
Friday 30 January 2026 18:59:08 -0500 (0:00:05.953) 0:22:47.792 ********
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92
Friday 30 January 2026 18:59:08 -0500 (0:00:00.214) 0:22:48.006 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817460.0257502, "attr_flags": "", "attributes": [],
"block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "30975c1ff57be573b5c3006b159d430ae2f44a90", "ctime": 1769817460.0227501, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 322961545, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1769817460.0227501, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "3166550135", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Friday 30 January 2026 18:59:10 -0500 (0:00:01.333) 0:22:49.340 ******** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Friday 30 January 2026 18:59:11 -0500 (0:00:01.349) 0:22:50.690 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Friday 30 January 2026 18:59:11 -0500 (0:00:00.192) 0:22:50.882 ******** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": 
null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Friday 30 January 2026 18:59:11 -0500 (0:00:00.211) 0:22:51.093 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Friday 30 January 2026 18:59:12 -0500 (0:00:00.323) 0:22:51.417 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Friday 30 January 2026 18:59:12 -0500 (0:00:00.285) 0:22:51.702 ******** changed: [managed-node2] => (item={'src': '/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Friday 30 January 2026 18:59:14 -0500 (0:00:01.604) 0:22:53.306 ******** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Friday 30 January 2026 18:59:16 -0500 (0:00:02.266) 0:22:55.573 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: 
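After blivet destroys the stack, the role reconciles the bookkeeping files: the Remove obsolete mounts task above dropped the /opt/test1 entry, and the crypttab task that follows removes the matching LUKS line. A sketch of both cleanup steps driven by the mounts/crypts lists the role reports; the loop variables mount_info and entry appear in this log, while the module choices here are illustrative, not necessarily the role's own implementation:

- name: Remove obsolete mounts (sketch)
  ansible.posix.mount:
    src: "{{ mount_info['src'] }}"
    path: "{{ mount_info['path'] }}"
    fstype: "{{ mount_info['fstype'] }}"
    state: absent          # removes the fstab line and unmounts
  loop: "{{ blivet_output.mounts | selectattr('state', 'eq', 'absent') | list }}"
  loop_control:
    loop_var: mount_info

- name: Manage /etc/crypttab to account for changes we just made (sketch)
  community.general.crypttab:
    name: "{{ entry['name'] }}"   # mapper name, e.g. luks-454a54a5-...
    state: absent
  loop: "{{ blivet_output.crypts | selectattr('state', 'eq', 'absent') | list }}"
  loop_control:
    loop_var: entry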
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Friday 30 January 2026 18:59:16 -0500 (0:00:00.337) 0:22:55.910 ********
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Friday 30 January 2026 18:59:16 -0500 (0:00:00.213) 0:22:56.124 ********
ok: [managed-node2] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Friday 30 January 2026 18:59:18 -0500 (0:00:01.804) 0:22:57.929 ********
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817470.9607224, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "aa853f373914987081ac7ef8d9f6ce93dedd72ea", "ctime": 1769817464.5487387, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 283116162, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1769817464.5477386, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1514363400", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Friday 30 January 2026 18:59:20 -0500 (0:00:01.710) 0:22:59.639 ********
changed: [managed-node2] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-454a54a5-fe61-4dba-a3b5-b60deeb2f467", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Friday 30 January 2026 18:59:22 -0500 (0:00:01.792) 0:23:01.431 ********
ok: [managed-node2]
TASK [Verify role results - 11] ************************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:537
Friday 30 January 2026 18:59:24 -0500 (0:00:02.126) 0:23:03.558 ********
included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2
TASK [Print out pool information] **********************************************
task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Friday 30 January 2026 18:59:24 -0500 (0:00:00.638) 0:23:04.196 ********
skipping: [managed-node2] => {}
TASK [Print out volume information] ********************************************
task path:
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Friday 30 January 2026 18:59:25 -0500 (0:00:00.317) 0:23:04.513 ******** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=iWbLf2-pI6s-fAVq-vldH-x42y-4tNs-WFOSje", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Friday 30 January 2026 18:59:25 -0500 (0:00:00.320) 0:23:04.834 ******** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Friday 30 January 2026 18:59:27 -0500 (0:00:01.569) 0:23:06.403 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002552", "end": "2026-01-30 18:59:28.370456", "rc": 0, "start": "2026-01-30 18:59:28.367904" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible 
filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Friday 30 January 2026 18:59:28 -0500 (0:00:01.500) 0:23:07.904 ******** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002565", "end": "2026-01-30 18:59:29.638612", "failed_when_result": false, "rc": 0, "start": "2026-01-30 18:59:29.636047" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Friday 30 January 2026 18:59:29 -0500 (0:00:01.253) 0:23:09.158 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Friday 30 January 2026 18:59:30 -0500 (0:00:00.168) 0:23:09.326 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Friday 30 January 2026 18:59:30 -0500 (0:00:00.301) 0:23:09.628 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Friday 30 January 2026 18:59:30 -0500 (0:00:00.229) 0:23:09.858 ******** included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Friday 30 January 2026 18:59:31 -0500 (0:00:01.055) 0:23:10.913 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Friday 30 January 2026 18:59:31 -0500 (0:00:00.317) 0:23:11.231 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Friday 30 January 2026 18:59:32 -0500 (0:00:01.021) 0:23:12.252 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Friday 30 January 2026 18:59:33 -0500 (0:00:00.202) 0:23:12.454 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Friday 30 January 2026 18:59:33 -0500 (0:00:00.154) 0:23:12.609 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Friday 30 January 2026 18:59:33 -0500 (0:00:00.175) 0:23:12.784 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Friday 30 January 2026 18:59:33 -0500 (0:00:00.272) 0:23:13.057 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Friday 30 January 2026 18:59:33 -0500 (0:00:00.213) 0:23:13.271 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Friday 30 January 2026 18:59:34 -0500 (0:00:00.280) 0:23:13.551 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Friday 30 January 2026 18:59:34 -0500 (0:00:00.169) 0:23:13.721 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Friday 30 January 2026 18:59:34 -0500 (0:00:00.272) 0:23:13.994 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Friday 30 January 2026 18:59:34 -0500 (0:00:00.162) 0:23:14.156 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Friday 30 January 2026 18:59:35 -0500 (0:00:00.482) 0:23:14.639 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Friday 30 January 2026 18:59:35 -0500 (0:00:00.206) 0:23:14.845 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Friday 30 January 2026 18:59:35 -0500 (0:00:00.197) 0:23:15.042 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Friday 30 January 2026 18:59:35 -0500 
(0:00:00.187) 0:23:15.230 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Friday 30 January 2026 18:59:36 -0500 (0:00:00.200) 0:23:15.430 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Friday 30 January 2026 18:59:36 -0500 (0:00:00.205) 0:23:15.635 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Friday 30 January 2026 18:59:36 -0500 (0:00:00.220) 0:23:15.856 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Friday 30 January 2026 18:59:36 -0500 (0:00:00.172) 0:23:16.028 ******** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1769817548.1365266, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1769817548.1365266, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 35654, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1769817548.1365266, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Friday 30 January 2026 18:59:37 -0500 (0:00:01.002) 0:23:17.031 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Friday 30 January 2026 18:59:37 -0500 (0:00:00.203) 0:23:17.234 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Friday 30 January 2026 18:59:38 -0500 (0:00:00.216) 
0:23:17.450 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Friday 30 January 2026 18:59:38 -0500 (0:00:00.094) 0:23:17.545 ******** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Friday 30 January 2026 18:59:38 -0500 (0:00:00.201) 0:23:17.746 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Friday 30 January 2026 18:59:38 -0500 (0:00:00.314) 0:23:18.060 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Friday 30 January 2026 18:59:38 -0500 (0:00:00.198) 0:23:18.259 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Friday 30 January 2026 18:59:39 -0500 (0:00:00.170) 0:23:18.429 ******** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Friday 30 January 2026 18:59:43 -0500 (0:00:04.353) 0:23:22.783 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Friday 30 January 2026 18:59:43 -0500 (0:00:00.189) 0:23:22.972 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Friday 30 January 2026 18:59:43 -0500 (0:00:00.233) 0:23:23.205 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Friday 30 January 2026 18:59:44 -0500 (0:00:00.189) 0:23:23.395 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if 
encrypted] *********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Friday 30 January 2026 18:59:44 -0500 (0:00:00.173) 0:23:23.569 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Friday 30 January 2026 18:59:44 -0500 (0:00:00.172) 0:23:23.742 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Friday 30 January 2026 18:59:44 -0500 (0:00:00.125) 0:23:23.868 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Friday 30 January 2026 18:59:44 -0500 (0:00:00.174) 0:23:24.043 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Friday 30 January 2026 18:59:44 -0500 (0:00:00.095) 0:23:24.138 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Friday 30 January 2026 18:59:45 -0500 (0:00:00.184) 0:23:24.322 ******** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Friday 30 January 2026 18:59:45 -0500 (0:00:00.161) 0:23:24.483 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Friday 30 January 2026 18:59:45 -0500 (0:00:00.179) 0:23:24.663 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Friday 30 January 2026 18:59:45 -0500 (0:00:00.165) 0:23:24.829 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Friday 30 January 2026 18:59:45 -0500 (0:00:00.181) 0:23:25.011 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Friday 30 January 2026 18:59:45 -0500 (0:00:00.180) 0:23:25.191 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Friday 30 January 2026 18:59:46 -0500 (0:00:00.361) 0:23:25.553 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Friday 30 January 2026 18:59:46 -0500 (0:00:00.135) 0:23:25.688 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Friday 30 January 2026 18:59:46 -0500 (0:00:00.173) 0:23:25.862 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Friday 30 January 2026 18:59:46 -0500 (0:00:00.213) 0:23:26.075 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Friday 30 January 2026 18:59:46 -0500 (0:00:00.212) 0:23:26.288 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Friday 30 January 2026 18:59:47 -0500 (0:00:00.186) 0:23:26.474 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Friday 30 January 2026 18:59:47 -0500 (0:00:00.229) 0:23:26.704 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Friday 30 January 2026 18:59:47 -0500 (0:00:00.221) 0:23:26.926 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Friday 30 January 2026 18:59:47 -0500 (0:00:00.107) 0:23:27.033 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Friday 30 January 2026 18:59:47 -0500 (0:00:00.185) 0:23:27.218 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Friday 30 January 2026 18:59:48 -0500 (0:00:00.200) 0:23:27.419 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Friday 30 January 2026 18:59:48 -0500 (0:00:00.166) 0:23:27.585 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Friday 30 January 2026 18:59:48 -0500 (0:00:00.202) 0:23:27.788 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Friday 30 January 2026 18:59:48 -0500 (0:00:00.307) 0:23:28.096 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Friday 30 January 2026 18:59:48 -0500 (0:00:00.178) 0:23:28.275 ******** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Friday 30 January 2026 18:59:49 -0500 (0:00:00.238) 0:23:28.513 ******** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Friday 30 January 2026 18:59:49 -0500 (0:00:00.202) 0:23:28.716 ******** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Friday 30 January 2026 18:59:49 -0500 (0:00:00.201) 0:23:28.918 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Friday 30 January 2026 18:59:49 -0500 (0:00:00.249) 0:23:29.168 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Friday 30 January 2026 18:59:50 -0500 (0:00:00.219) 0:23:29.388 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Friday 30 January 2026 18:59:50 -0500 (0:00:00.231) 0:23:29.619 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Friday 30 January 2026 18:59:50 -0500 (0:00:00.243) 0:23:29.863 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Friday 30 January 2026 18:59:50 -0500 (0:00:00.247) 0:23:30.111 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Friday 30 January 2026 18:59:51 -0500 (0:00:00.239) 0:23:30.350 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Friday 30 January 2026 18:59:51 -0500 (0:00:00.274) 0:23:30.625 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Friday 30 January 2026 18:59:51 -0500 (0:00:00.229) 0:23:30.854 ******** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Friday 30 January 2026 18:59:51 -0500 (0:00:00.151) 0:23:31.006 ******** skipping: [managed-node2] => {} TASK [Show test volume size] 
*************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Friday 30 January 2026 18:59:51 -0500 (0:00:00.235) 0:23:31.242 ******** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Friday 30 January 2026 18:59:52 -0500 (0:00:00.149) 0:23:31.417 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Friday 30 January 2026 18:59:52 -0500 (0:00:00.176) 0:23:31.593 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Friday 30 January 2026 18:59:52 -0500 (0:00:00.153) 0:23:31.746 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Friday 30 January 2026 18:59:52 -0500 (0:00:00.119) 0:23:31.866 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Friday 30 January 2026 18:59:52 -0500 (0:00:00.163) 0:23:32.030 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Friday 30 January 2026 18:59:52 -0500 (0:00:00.167) 0:23:32.197 ******** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Friday 30 January 2026 18:59:53 -0500 (0:00:00.156) 0:23:32.353 ******** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Friday 30 January 2026 18:59:53 -0500 (0:00:00.120) 0:23:32.474 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Friday 30 January 2026 18:59:53 -0500 
(0:00:00.255) 0:23:32.730 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Friday 30 January 2026 18:59:53 -0500 (0:00:00.205) 0:23:32.935 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Friday 30 January 2026 18:59:53 -0500 (0:00:00.193) 0:23:33.128 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Friday 30 January 2026 18:59:54 -0500 (0:00:00.199) 0:23:33.328 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Friday 30 January 2026 18:59:54 -0500 (0:00:00.212) 0:23:33.540 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Friday 30 January 2026 18:59:54 -0500 (0:00:00.244) 0:23:33.785 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Friday 30 January 2026 18:59:54 -0500 (0:00:00.179) 0:23:33.965 ******** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Friday 30 January 2026 18:59:54 -0500 (0:00:00.174) 0:23:34.139 ******** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Friday 30 January 2026 18:59:54 -0500 (0:00:00.126) 0:23:34.265 ******** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node2 : ok=1229 changed=60 unreachable=0 failed=9 skipped=1068 rescued=9 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:37:38.318562+00:00Z", "host": "managed-node2", "message": "encrypted 
volume 'foo' missing key/password", "start_time": "2026-01-30T23:37:32.882139+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:37:38.505967+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-30T23:37:38.350576+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:39:34.018418+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'luks-63563baa-e47a-4110-ace5-a19589f17bad' in safe mode due to encryption removal", "start_time": "2026-01-30T23:39:28.750090+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", 
"task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:39:34.211520+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-63563baa-e47a-4110-ace5-a19589f17bad' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-30T23:39:34.025541+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:41:19.313230+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-01-30T23:41:14.123793+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": 
"/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:41:19.510274+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-30T23:41:19.338787+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:43:09.865395+00:00Z", "host": "managed-node2", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-01-30T23:43:04.774233+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.9.27", "end_time": 
"2026-01-30T23:43:10.069816+00:00Z", "host": "managed-node2", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-30T23:43:09.891026+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.9.27", "end_time": "2026-01-30T23:45:25.089358+00:00Z", "host": "managed-node2", "message": "cannot remove existing formatting on 
device 'luks-e6b9be7c-8967-4afd-8c12-6d7654b37892' in safe mode due to encryption removal",
  "start_time": "2026-01-30T23:45:19.248151+00:00",
  "task_name": "Manage the pools and volumes to match the specified state",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:45:25.410392+00:00",
  "host": "managed-node2",
  "message": {
    "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true,
    "invocation": { "module_args": {
      "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false,
      "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] },
      "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition",
        "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ],
      "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true,
      "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null },
      "volumes": [] } },
    "leaves": [], "mounts": [],
    "msg": "cannot remove existing formatting on device 'luks-e6b9be7c-8967-4afd-8c12-6d7654b37892' in safe mode due to encryption removal",
    "packages": [], "pools": [], "volumes": [] },
  "start_time": "2026-01-30T23:45:25.109256+00:00",
  "task_name": "Failed message",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:47:38.881089+00:00",
  "host": "managed-node2",
  "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption",
  "start_time": "2026-01-30T23:47:33.097536+00:00",
  "task_name": "Manage the pools and volumes to match the specified state",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:47:38.996778+00:00",
  "host": "managed-node2",
  "message": {
    "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true,
    "invocation": { "module_args": {
      "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false,
      "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] },
      "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition",
        "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ],
      "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true,
      "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null },
      "volumes": [] } },
    "leaves": [], "mounts": [],
    "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption",
    "packages": [], "pools": [], "volumes": [] },
  "start_time": "2026-01-30T23:47:38.889132+00:00",
  "task_name": "Failed message",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:49:56.466368+00:00",
  "host": "managed-node2",
  "message": "encrypted volume 'test1' missing key/password",
  "start_time": "2026-01-30T23:49:50.690078+00:00",
  "task_name": "Manage the pools and volumes to match the specified state",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:49:56.813471+00:00",
  "host": "managed-node2",
  "message": {
    "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true,
    "invocation": { "module_args": {
      "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false,
      "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] },
      "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm",
        "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ],
      "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true,
      "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null },
      "volumes": [] } },
    "leaves": [], "mounts": [],
    "msg": "encrypted volume 'test1' missing key/password",
    "packages": [], "pools": [], "volumes": [] },
  "start_time": "2026-01-30T23:49:56.473522+00:00",
  "task_name": "Failed message",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:54:25.380071+00:00",
  "host": "managed-node2",
  "message": "cannot remove existing formatting on device 'luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d' in safe mode due to encryption removal",
  "start_time": "2026-01-30T23:54:19.773443+00:00",
  "task_name": "Manage the pools and volumes to match the specified state",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:54:25.549213+00:00",
  "host": "managed-node2",
  "message": {
    "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true,
    "invocation": { "module_args": {
      "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false,
      "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] },
      "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm",
        "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ],
      "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true,
      "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null },
      "volumes": [] } },
    "leaves": [], "mounts": [],
    "msg": "cannot remove existing formatting on device 'luks-c3cb5b2a-dd54-432f-b831-c342bb741a6d' in safe mode due to encryption removal",
    "packages": [], "pools": [], "volumes": [] },
  "start_time": "2026-01-30T23:54:25.413580+00:00",
  "task_name": "Failed message",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:56:51.290815+00:00",
  "host": "managed-node2",
  "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption",
  "start_time": "2026-01-30T23:56:45.282596+00:00",
  "task_name": "Manage the pools and volumes to match the specified state",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" },

{ "ansible_version": "2.9.27",
  "end_time": "2026-01-30T23:56:51.621109+00:00",
  "host": "managed-node2",
  "message": {
    "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true,
    "invocation": { "module_args": {
      "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false,
      "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] },
      "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm",
        "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ],
      "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true,
      "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null },
      "volumes": [] } },
    "leaves": [], "mounts": [],
    "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption",
    "packages": [], "pools": [], "volumes": [] },
  "start_time": "2026-01-30T23:56:51.326462+00:00",
  "task_name": "Failed message",
  "task_path": "/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ]
SYSTEM ROLES ERRORS END v1
(The safe-mode and missing-password failures collected above are explained, with example remediations, in the notes after the recap below.)

TASKS RECAP ********************************************************************
Friday 30 January 2026 18:59:55 -0500 (0:00:00.275)       0:23:34.541 ********
===============================================================================
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.42s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.81s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.78s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.61s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.58s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.35s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Make sure blivet is available ------- 6.22s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Get required packages --------------- 6.06s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.04s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.03s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 6.02s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.95s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.90s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.85s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.81s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.79s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.78s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.64s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.59s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
fedora.linux_system_roles.storage : Get required packages --------------- 5.57s
/tmp/collections-3rv/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
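
Note on the "in safe mode" failures collected above: these are the storage role's safe mode working as intended. With the role's documented default of storage_safe_mode: true, the blivet module refuses any action that would destroy existing formatting, whether that is stripping the LUKS layer off an encrypted volume or reformatting an already-formatted device. A minimal sketch of a play that deliberately opts out of that protection to remove encryption from the test volume (the pool, volume, and disk names mirror the module_args above; the passphrase variable is a placeholder, not from this log):

- name: Remove LUKS from test1, explicitly allowing destructive changes
  hosts: managed-node2
  vars:
    storage_safe_mode: false    # role default is true; false permits destroying existing formatting
    storage_pools:
      - name: foo
        type: partition
        disks: [sda]
        volumes:
          - name: test1
            type: partition
            fs_type: xfs
            mount_point: /opt/test1
            encryption: false                        # tear down the existing LUKS layer
            encryption_password: "{{ luks_pass }}"   # placeholder; the tests above pass the passphrase on removal too
  roles:
    - fedora.linux_system_roles.storage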
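
The "due to adding encryption" failures on 'sda1' and 'test1' are the mirror image: the volume already carries an xfs filesystem, and encrypting it in place destroys that formatting, which safe mode likewise blocks. A sketch of the opposite direction, again assuming the caller has decided that losing the current contents is acceptable:

- name: Encrypt test1 in place (destroys the existing xfs contents)
  hosts: managed-node2
  vars:
    storage_safe_mode: false
    storage_pools:
      - name: foo
        type: lvm
        disks: [sda]
        volumes:
          - name: test1
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            encryption_password: "{{ luks_pass }}"   # placeholder secret
  roles:
    - fedora.linux_system_roles.storage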
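
The "encrypted volume 'test1' missing key/password" failure is different in kind: its module_args show safe_mode was already false, and the invocation was rejected only because encryption: true was requested while both encryption_password and encryption_key were null. Supplying either one satisfies the check; a common pattern is a vaulted variable, sketched here (vault_luks_password is an assumed Ansible Vault variable, not something from this log):

- name: Create an encrypted LVM volume, supplying the required secret
  hosts: managed-node2
  tasks:
    - name: Run the storage role with a vaulted LUKS passphrase
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: "{{ vault_luks_password }}"   # assumed vault-stored variable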
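
Finally, note that every error in the list above was provoked on purpose: tests_luks2.yml drives the role into each refusal and then verifies the message before retrying with the protection lifted, which is why the run still ends without a play failure. A sketch of the block/rescue pattern such negative tests typically use (task layout and the assertion are illustrative, not copied from the test file):

- name: Verify the role refuses a destructive change in safe mode
  hosts: managed-node2
  tasks:
    - block:
        - name: Attempt encryption removal with safe mode left at its default
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    fs_type: xfs
                    mount_point: /opt/test1
                    encryption: false
        - name: Unreachable if the role failed as expected
          fail:
            msg: Role was expected to fail in safe mode, but did not
      rescue:
        - name: Check that the failure is the safe-mode refusal we expect
          assert:
            that:
              - "'in safe mode' in ansible_failed_result.msg"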