ansible-playbook 2.9.27
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.9/site-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.9.19 (main, May 16 2024, 11:40:09) [GCC 8.5.0 20210514 (Red Hat 8.5.0-22)]
No config file found; using defaults
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
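For orientation, the LUKS scenario exercised below amounts to invoking the fedora.linux_system_roles.storage role with a single encrypted disk volume. The following is a minimal sketch assembled from values that appear later in this log (disk sda, mount point /opt/test1, volume name foo, storage_safe_mode true); it is not the test's exact task file, and the play header and task name are placeholders:

    - hosts: managed-node9
      tasks:
        - name: Create an encrypted disk volume (first test case expects the role to fail)
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: true
            storage_volumes:
              - name: foo
                type: disk
                disks:
                  - sda
                mount_point: /opt/test1
                encryption: true
                # deliberately no encryption key supplied here, matching the
                # "new encrypted volume w/ no key" failure-path check below
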
PLAYBOOK: tests_luks.yml ******************************************************* 1 plays in /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml PLAY [Test LUKS] *************************************************************** TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2 Saturday 21 March 2026 18:56:49 -0400 (0:00:00.375) 0:00:00.375 ******** ok: [managed-node9] META: ran handlers TASK [Enable FIPS mode] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20 Saturday 21 March 2026 18:56:53 -0400 (0:00:03.940) 0:00:04.316 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28 Saturday 21 March 2026 18:56:53 -0400 (0:00:00.406) 0:00:04.722 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Enable FIPS mode - 2] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39 Saturday 21 March 2026 18:56:54 -0400 (0:00:00.452) 0:00:05.174 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot - 2] ************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43 Saturday 21 March 2026 18:56:54 -0400 (0:00:00.359) 0:00:05.534 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure dracut-fips] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53 Saturday 21 March 2026 18:56:54 -0400 (0:00:00.280) 0:00:05.815 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Configure boot for FIPS] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59 Saturday 21 March 2026 18:56:55 -0400 (0:00:00.322) 0:00:06.138 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot - 3] ************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68 Saturday 21 March 2026 18:56:55 -0400 (0:00:00.309) 0:00:06.448 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72 Saturday 21 March 2026 18:56:56 -0400 (0:00:00.617) 0:00:07.065 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] 
************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 18:56:56 -0400 (0:00:00.195) 0:00:07.260 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 18:56:56 -0400 (0:00:00.373) 0:00:07.634 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 18:56:57 -0400 (0:00:00.784) 0:00:08.418 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 18:56:57 -0400 (0:00:00.181) 0:00:08.600 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 18:57:00 -0400 (0:00:02.479) 0:00:11.079 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 18:57:00 -0400 (0:00:00.628) 0:00:11.708 ******** ok: [managed-node9] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 18:57:03 -0400 (0:00:02.980) 0:00:14.688 ******** ok: [managed-node9] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 18:57:04 -0400 (0:00:00.218) 0:00:14.907 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 18:57:04 -0400 (0:00:00.155) 0:00:15.063 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 18:57:04 -0400 (0:00:00.189) 0:00:15.252 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 18:57:05 -0400 (0:00:00.932) 0:00:16.185 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 18:57:05 -0400 (0:00:00.205) 0:00:16.390 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 18:57:05 -0400 (0:00:00.203) 0:00:16.593 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 18:57:10 -0400 (0:00:05.206) 0:00:21.800 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 18:57:11 -0400 (0:00:00.167) 0:00:21.967 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 
2026 18:57:11 -0400 (0:00:00.131) 0:00:22.099 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 18:57:13 -0400 (0:00:02.715) 0:00:24.814 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 18:57:14 -0400 (0:00:00.238) 0:00:25.053 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 18:57:14 -0400 (0:00:00.115) 0:00:25.169 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 18:57:14 -0400 (0:00:00.197) 0:00:25.367 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 18:57:14 -0400 (0:00:00.094) 0:00:25.462 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 18:57:18 -0400 (0:00:03.865) 0:00:29.328 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { 
"name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { 
"name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": 
"lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" 
}, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 18:57:21 -0400 (0:00:02.995) 0:00:32.323 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 18:57:21 -0400 (0:00:00.334) 0:00:32.658 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev 
issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 18:57:23 -0400 (0:00:01.614) 0:00:34.272 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 18:57:23 -0400 (0:00:00.252) 0:00:34.524 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774133756.6355941, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1774133754.8795838, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774133754.8795838, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 18:57:24 -0400 (0:00:01.286) 0:00:35.811 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 18:57:25 -0400 (0:00:00.187) 0:00:35.998 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 18:57:25 -0400 (0:00:00.288) 0:00:36.287 ******** ok: [managed-node9] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 18:57:25 -0400 (0:00:00.116) 0:00:36.404 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 18:57:25 -0400 (0:00:00.128) 0:00:36.532 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 18:57:25 -0400 
(0:00:00.118) 0:00:36.651 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 18:57:25 -0400 (0:00:00.191) 0:00:36.842 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 18:57:26 -0400 (0:00:00.199) 0:00:37.041 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 18:57:26 -0400 (0:00:00.156) 0:00:37.198 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 18:57:26 -0400 (0:00:00.179) 0:00:37.377 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 18:57:26 -0400 (0:00:00.206) 0:00:37.584 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774132749.8346348, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 18:57:28 -0400 (0:00:01.433) 0:00:39.017 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 18:57:28 -0400 (0:00:00.187) 0:00:39.205 ******** ok: [managed-node9] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:75 Saturday 21 March 2026 18:57:29 -0400 (0:00:01.552) 0:00:40.757 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node9 TASK [Ensure test packages] **************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Saturday 21 March 2026 18:57:30 -0400 (0:00:00.162) 0:00:40.919 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: util-linux TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Saturday 21 March 2026 18:57:34 -0400 (0:00:04.418) 0:00:45.338 ******** ok: [managed-node9] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Saturday 21 March 2026 18:57:36 -0400 (0:00:02.441) 0:00:47.779 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Saturday 21 March 2026 18:57:37 -0400 (0:00:00.169) 0:00:47.949 ******** ok: [managed-node9] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Saturday 21 March 2026 18:57:37 -0400 (0:00:00.181) 0:00:48.131 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Saturday 21 March 2026 18:57:37 -0400 (0:00:00.151) 0:00:48.282 ******** ok: [managed-node9] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:84 Saturday 21 March 2026 18:57:37 -0400 
(0:00:00.091) 0:00:48.373 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 18:57:37 -0400 (0:00:00.176) 0:00:48.550 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 18:57:37 -0400 (0:00:00.155) 0:00:48.705 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 18:57:38 -0400 (0:00:00.272) 0:00:48.977 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 18:57:38 -0400 (0:00:00.167) 0:00:49.145 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 18:57:38 -0400 (0:00:00.187) 0:00:49.332 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 18:57:38 -0400 (0:00:00.190) 0:00:49.523 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 18:57:41 -0400 (0:00:02.388) 0:00:51.912 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": 
"CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 18:57:41 -0400 (0:00:00.347) 0:00:52.260 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 18:57:41 -0400 (0:00:00.193) 0:00:52.454 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 18:57:41 -0400 (0:00:00.115) 0:00:52.569 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 18:57:41 -0400 (0:00:00.108) 0:00:52.678 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 18:57:41 -0400 (0:00:00.064) 0:00:52.743 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 18:57:42 -0400 (0:00:00.263) 0:00:53.006 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 18:57:42 -0400 (0:00:00.198) 0:00:53.204 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 18:57:42 -0400 (0:00:00.121) 0:00:53.326 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: 
Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 18:57:46 -0400 (0:00:03.708) 0:00:57.034 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 18:57:46 -0400 (0:00:00.227) 0:00:57.262 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 18:57:46 -0400 (0:00:00.281) 0:00:57.543 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 18:57:51 -0400 (0:00:04.983) 0:01:02.527 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 18:57:52 -0400 (0:00:00.408) 0:01:02.936 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 18:57:52 -0400 (0:00:00.157) 0:01:03.093 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 18:57:52 -0400 (0:00:00.180) 0:01:03.273 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 18:57:52 -0400 (0:00:00.161) 0:01:03.435 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 18:57:56 -0400 (0:00:04.194) 0:01:07.629 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", 
"state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": 
"iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { 
"name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": 
"selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": 
"systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { 
"name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 18:57:59 -0400 (0:00:02.425) 0:01:10.054 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 18:57:59 -0400 (0:00:00.218) 0:01:10.273 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 18:58:04 -0400 (0:00:04.989) 0:01:15.262 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'foo' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 
'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 18:58:04 -0400 (0:00:00.187) 0:01:15.450 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 18:58:04 -0400 (0:00:00.276) 0:01:15.726 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 18:58:05 -0400 (0:00:00.167) 0:01:15.894 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Saturday 21 March 2026 18:58:05 -0400 (0:00:00.251) 0:01:16.145 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Saturday 21 March 2026 18:58:05 -0400 (0:00:00.156) 0:01:16.302 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 18:58:05 -0400 (0:00:00.331) 0:01:16.634 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 18:58:05 -0400 (0:00:00.174) 0:01:16.809 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 18:58:06 -0400 (0:00:00.186) 0:01:16.995 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 18:58:06 -0400 (0:00:00.130) 0:01:17.126 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 18:58:07 -0400 (0:00:01.426) 0:01:18.552 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 18:58:07 -0400 (0:00:00.206) 0:01:18.759 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 18:58:08 -0400 (0:00:00.196) 0:01:18.956 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 18:58:08 -0400 (0:00:00.302) 0:01:19.258 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 18:58:08 -0400 (0:00:00.106) 0:01:19.364 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 18:58:08 -0400 (0:00:00.148) 0:01:19.513 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 18:58:08 -0400 (0:00:00.312) 0:01:19.826 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 18:58:09 -0400 (0:00:00.241) 0:01:20.068 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 18:58:09 -0400 (0:00:00.157) 0:01:20.226 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 18:58:13 -0400 (0:00:04.191) 0:01:24.417 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 18:58:13 -0400 (0:00:00.248) 0:01:24.666 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 18:58:14 -0400 (0:00:00.255) 0:01:24.922 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 18:58:19 -0400 (0:00:04.944) 0:01:29.866 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 18:58:19 -0400 (0:00:00.379) 0:01:30.246 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support 
packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 18:58:19 -0400 (0:00:00.113) 0:01:30.359 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 18:58:19 -0400 (0:00:00.153) 0:01:30.512 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 18:58:19 -0400 (0:00:00.109) 0:01:30.622 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 18:58:24 -0400 (0:00:04.261) 0:01:34.883 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": 
"mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": 
"systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 18:58:26 -0400 (0:00:02.889) 0:01:37.773 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 18:58:27 -0400 (0:00:00.425) 0:01:38.198 ******** changed: [managed-node9] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 18:58:40 -0400 (0:00:13.429) 0:01:51.627 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 18:58:41 -0400 (0:00:00.251) 0:01:51.879 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774133756.6355941, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab8070345774adad92683e9645714452be7be474", "ctime": 1774133754.8795838, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774133754.8795838, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1343, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 18:58:42 -0400 (0:00:01.645) 0:01:53.525 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 18:58:45 -0400 (0:00:02.375) 0:01:55.900 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 18:58:45 -0400 (0:00:00.249) 0:01:56.149 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "create format", "device": 
"/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 18:58:45 -0400 (0:00:00.171) 0:01:56.321 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 18:58:45 -0400 (0:00:00.179) 0:01:56.500 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": 
null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 18:58:45 -0400 (0:00:00.291) 0:01:56.791 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 18:58:46 -0400 (0:00:00.121) 0:01:56.913 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 18:58:50 -0400 (0:00:04.284) 0:02:01.198 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 18:58:52 -0400 (0:00:02.364) 0:02:03.562 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 18:58:52 -0400 (0:00:00.242) 0:02:03.805 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 18:58:54 -0400 (0:00:01.503) 0:02:05.308 ******** ok: 
[managed-node9] => { "changed": false, "stat": { "atime": 1774132749.8346348, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1716968941.893, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 135, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1716968586.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1157759751", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 18:58:55 -0400 (0:00:01.336) 0:02:06.644 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-650cacf1-b078-496c-9d32-c0b5fdf3584c', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 18:58:57 -0400 (0:00:01.489) 0:02:08.133 ******** ok: [managed-node9] TASK [Verify role results] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:110 Saturday 21 March 2026 18:58:59 -0400 (0:00:01.922) 0:02:10.056 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 18:58:59 -0400 (0:00:00.365) 0:02:10.421 ******** skipping: [managed-node9] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 18:58:59 -0400 (0:00:00.189) 0:02:10.611 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 18:59:00 -0400 (0:00:00.265) 0:02:10.876 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "size": "10G", "type": "crypt", "uuid": "e21fd51b-03c8-4716-94ec-68aa8559ab45" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "650cacf1-b078-496c-9d32-c0b5fdf3584c" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 18:59:03 -0400 (0:00:03.101) 0:02:13.978 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002592", "end": "2026-03-21 18:59:05.720845", "rc": 0, "start": "2026-03-21 18:59:05.718253" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 18:59:05 -0400 (0:00:02.813) 0:02:16.791 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002616", "end": "2026-03-21 18:59:06.704755", "failed_when_result": false, "rc": 0, "start": "2026-03-21 18:59:06.702139" } STDOUT: luks-650cacf1-b078-496c-9d32-c0b5fdf3584c /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 18:59:06 -0400 (0:00:01.033) 0:02:17.824 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 18:59:07 -0400 (0:00:00.187) 0:02:18.012 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 18:59:07 -0400 (0:00:00.241) 0:02:18.254 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 18:59:07 -0400 (0:00:00.283) 0:02:18.538 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for 
managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 18:59:08 -0400 (0:00:01.183) 0:02:19.721 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 18:59:09 -0400 (0:00:00.155) 0:02:19.877 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 18:59:09 -0400 (0:00:00.216) 0:02:20.093 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 18:59:09 -0400 (0:00:00.399) 0:02:20.493 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 18:59:09 -0400 (0:00:00.261) 0:02:20.754 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 18:59:10 -0400 (0:00:00.226) 0:02:20.980 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 18:59:10 -0400 (0:00:00.346) 0:02:21.327 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 18:59:10 -0400 (0:00:00.476) 0:02:21.803 
******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 18:59:11 -0400 (0:00:00.224) 0:02:22.028 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 18:59:11 -0400 (0:00:00.272) 0:02:22.301 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 18:59:11 -0400 (0:00:00.251) 0:02:22.553 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 18:59:11 -0400 (0:00:00.161) 0:02:22.714 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 18:59:12 -0400 (0:00:00.452) 0:02:23.167 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 18:59:12 -0400 (0:00:00.227) 0:02:23.395 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 18:59:12 -0400 (0:00:00.233) 0:02:23.628 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 18:59:13 -0400 (0:00:00.237) 0:02:23.866 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 18:59:13 -0400 (0:00:00.264) 0:02:24.131 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 18:59:13 -0400 (0:00:00.199) 0:02:24.331 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 18:59:13 -0400 (0:00:00.278) 0:02:24.610 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 18:59:14 -0400 (0:00:00.305) 0:02:24.915 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774133920.352539, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774133920.352539, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1774133920.352539, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 18:59:15 -0400 (0:00:01.277) 0:02:26.193 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 18:59:15 -0400 (0:00:00.273) 0:02:26.466 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 18:59:15 -0400 (0:00:00.292) 0:02:26.758 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 18:59:16 -0400 (0:00:00.259) 0:02:27.018 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 18:59:16 -0400 (0:00:00.201) 0:02:27.220 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 18:59:16 -0400 (0:00:00.132) 0:02:27.353 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 18:59:16 -0400 (0:00:00.224) 0:02:27.577 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774133920.49454, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774133920.49454, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 173169, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774133920.49454, "nlink": 1, "path": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 18:59:18 -0400 (0:00:01.282) 0:02:28.860 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 18:59:21 -0400 (0:00:03.503) 0:02:32.363 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.011871", "end": "2026-03-21 18:59:22.934495", "rc": 0, "start": "2026-03-21 18:59:22.922624" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 650cacf1-b078-496c-9d32-c0b5fdf3584c Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 919618 Threads: 2 Salt: 67 72 5f 70 61 d7 f9 34 
47 c1 46 3a bd f9 45 85 26 1f a9 2b 65 62 72 cb 0c 70 d4 af 96 59 8e ae AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120029 Salt: 7e 7f b4 16 ab 09 42 28 cc 85 7d 71 ee f4 c1 30 d6 d8 9a b4 85 83 3a bc 8c 07 a7 a3 db 07 5c 39 Digest: fe ea 8e 8f 2c 38 e9 60 a1 a6 48 59 a2 d7 87 57 6a 68 50 2c b2 4d 9f aa da c4 c7 ab db ad 02 24 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 18:59:23 -0400 (0:00:01.684) 0:02:34.047 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 18:59:23 -0400 (0:00:00.237) 0:02:34.285 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 18:59:23 -0400 (0:00:00.298) 0:02:34.583 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 18:59:23 -0400 (0:00:00.195) 0:02:34.778 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 18:59:24 -0400 (0:00:00.194) 0:02:34.973 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 18:59:24 -0400 (0:00:00.289) 0:02:35.263 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 18:59:24 -0400 (0:00:00.294) 0:02:35.557 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 18:59:24 -0400 (0:00:00.231) 0:02:35.789 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-650cacf1-b078-496c-9d32-c0b5fdf3584c /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 18:59:25 -0400 (0:00:00.193) 0:02:35.982 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 18:59:25 -0400 (0:00:00.207) 0:02:36.189 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 18:59:25 -0400 (0:00:00.230) 0:02:36.420 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 18:59:25 -0400 (0:00:00.255) 0:02:36.675 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 18:59:26 -0400 (0:00:00.195) 0:02:36.871 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 18:59:26 -0400 (0:00:00.211) 0:02:37.082 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 18:59:26 -0400 (0:00:00.214) 0:02:37.296 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 18:59:26 -0400 (0:00:00.268) 0:02:37.565 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 18:59:27 -0400 (0:00:00.303) 0:02:37.869 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 18:59:27 -0400 (0:00:00.191) 0:02:38.061 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 18:59:27 -0400 (0:00:00.174) 0:02:38.236 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 18:59:27 -0400 (0:00:00.180) 0:02:38.417 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 18:59:27 -0400 (0:00:00.213) 0:02:38.630 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 18:59:28 -0400 (0:00:00.242) 0:02:38.873 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 18:59:28 -0400 (0:00:00.310) 0:02:39.183 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 18:59:28 -0400 (0:00:00.227) 0:02:39.411 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 18:59:28 -0400 (0:00:00.146) 0:02:39.557 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 18:59:28 -0400 (0:00:00.260) 0:02:39.818 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 18:59:29 -0400 (0:00:00.156) 0:02:39.975 ******** ok: 
[managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 18:59:29 -0400 (0:00:00.301) 0:02:40.277 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 18:59:29 -0400 (0:00:00.211) 0:02:40.488 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 18:59:29 -0400 (0:00:00.193) 0:02:40.681 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 18:59:30 -0400 (0:00:00.195) 0:02:40.876 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 18:59:30 -0400 (0:00:00.219) 0:02:41.096 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 18:59:30 -0400 (0:00:00.213) 0:02:41.309 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 18:59:30 -0400 (0:00:00.178) 0:02:41.488 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 18:59:30 -0400 (0:00:00.190) 0:02:41.679 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 18:59:31 -0400 (0:00:00.240) 0:02:41.920 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 18:59:31 -0400 (0:00:00.228) 
0:02:42.148 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 18:59:31 -0400 (0:00:00.195) 0:02:42.344 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 18:59:31 -0400 (0:00:00.232) 0:02:42.577 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 18:59:31 -0400 (0:00:00.153) 0:02:42.731 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 18:59:32 -0400 (0:00:00.268) 0:02:42.999 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 18:59:32 -0400 (0:00:00.052) 0:02:43.051 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 18:59:32 -0400 (0:00:00.084) 0:02:43.136 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 18:59:32 -0400 (0:00:00.085) 0:02:43.221 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 18:59:32 -0400 (0:00:00.226) 0:02:43.448 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 18:59:32 -0400 (0:00:00.324) 0:02:43.772 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 18:59:33 -0400 
(0:00:00.244) 0:02:44.017 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 18:59:33 -0400 (0:00:00.167) 0:02:44.184 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 18:59:33 -0400 (0:00:00.201) 0:02:44.386 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 18:59:33 -0400 (0:00:00.298) 0:02:44.685 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 18:59:34 -0400 (0:00:00.312) 0:02:44.997 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 18:59:34 -0400 (0:00:00.244) 0:02:45.242 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 18:59:34 -0400 (0:00:00.373) 0:02:45.615 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 18:59:35 -0400 (0:00:00.290) 0:02:45.905 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 18:59:35 -0400 (0:00:00.272) 0:02:46.178 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 18:59:35 -0400 (0:00:00.182) 0:02:46.360 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 18:59:35 -0400 (0:00:00.303) 0:02:46.664 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 18:59:36 -0400 (0:00:00.202) 0:02:46.867 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 18:59:36 -0400 (0:00:00.212) 0:02:47.079 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 21 March 2026 18:59:36 -0400 (0:00:00.180) 0:02:47.259 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:116 Saturday 21 March 2026 18:59:39 -0400 (0:00:02.808) 0:02:50.068 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 18:59:39 -0400 (0:00:00.598) 0:02:50.666 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 18:59:40 -0400 (0:00:00.208) 0:02:50.875 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 18:59:40 -0400 (0:00:00.224) 0:02:51.099 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 18:59:40 -0400 (0:00:00.224) 0:02:51.324 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 18:59:40 -0400 (0:00:00.296) 0:02:51.621 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 18:59:40 -0400 (0:00:00.209) 0:02:51.830 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 18:59:43 -0400 (0:00:02.569) 0:02:54.400 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 18:59:44 -0400 (0:00:00.562) 0:02:54.962 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 18:59:44 -0400 (0:00:00.284) 0:02:55.247 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 18:59:44 -0400 (0:00:00.265) 0:02:55.512 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 18:59:44 -0400 (0:00:00.297) 0:02:55.809 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 18:59:45 -0400 (0:00:00.145) 0:02:55.955 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 18:59:45 -0400 (0:00:00.466) 0:02:56.421 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 18:59:45 -0400 (0:00:00.221) 0:02:56.643 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 18:59:45 -0400 (0:00:00.193) 0:02:56.836 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 18:59:50 -0400 (0:00:04.500) 0:03:01.337 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 18:59:50 -0400 (0:00:00.247) 0:03:01.584 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 18:59:50 -0400 (0:00:00.163) 0:03:01.748 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK 
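
The storage_volumes value shown above is the request this safe_mode test hands to the role: a disk volume that currently carries LUKS formatting is redefined with "encryption": false, which can only be satisfied by re-formatting the device. A minimal playbook sketch of that invocation, assuming the role's storage_safe_mode variable (surfaced earlier in this log as storage_safe_mode_global) and ordinary include_role wiring; the test's actual plumbing lives in verify-role-failed.yml and run_role_with_clear_facts.yml and is not reproduced here:

  - hosts: managed-node9
    vars:
      storage_safe_mode: true              # role default; the guard this test exercises
    tasks:
      - name: Ask the role to drop the LUKS layer (expected to be refused in safe mode)
        include_role:
          name: fedora.linux_system_roles.storage
        vars:
          storage_volumes:
            - name: foo                    # values mirror the "Show storage_volumes" output above
              type: disk
              disks: [sda]
              mount_point: /opt/test1
              encryption: false            # the device is currently LUKS-formatted
              encryption_password: yabbadabbadoo
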
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 18:59:56 -0400 (0:00:05.115) 0:03:06.864 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 18:59:56 -0400 (0:00:00.226) 0:03:07.090 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 18:59:56 -0400 (0:00:00.106) 0:03:07.197 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 18:59:56 -0400 (0:00:00.140) 0:03:07.337 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 18:59:56 -0400 (0:00:00.114) 0:03:07.452 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:00:00 -0400 (0:00:03.456) 0:03:10.909 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": 
"sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:00:02 -0400 (0:00:02.656) 0:03:13.565 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:00:02 -0400 (0:00:00.203) 0:03:13.768 ******** fatal: [managed-node9]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-650cacf1-b078-496c-9d32-c0b5fdf3584c' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:00:07 -0400 (0:00:04.969) 0:03:18.737 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-650cacf1-b078-496c-9d32-c0b5fdf3584c' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:00:08 -0400 (0:00:00.247) 0:03:18.985 ******** TASK [Check that we failed 
in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:00:08 -0400 (0:00:00.243) 0:03:19.229 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:00:08 -0400 (0:00:00.216) 0:03:19.445 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Saturday 21 March 2026 19:00:08 -0400 (0:00:00.262) 0:03:19.708 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 21 March 2026 19:00:09 -0400 (0:00:00.151) 0:03:19.859 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774133978.962874, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774133978.962874, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1774133978.962874, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1239889724", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 21 March 2026 19:00:10 -0400 (0:00:01.483) 0:03:21.342 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:136 Saturday 21 March 2026 19:00:10 -0400 (0:00:00.209) 0:03:21.552 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:00:11 -0400 (0:00:00.460) 0:03:22.013 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 
March 2026 19:00:11 -0400 (0:00:00.159) 0:03:22.172 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:00:11 -0400 (0:00:00.262) 0:03:22.435 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:00:11 -0400 (0:00:00.236) 0:03:22.671 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:00:14 -0400 (0:00:02.366) 0:03:25.038 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:00:14 -0400 (0:00:00.362) 0:03:25.400 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:00:14 -0400 (0:00:00.177) 0:03:25.577 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:00:15 -0400 (0:00:00.317) 0:03:25.895 ******** ok: [managed-node9] => { "ansible_facts": 
{ "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:00:15 -0400 (0:00:00.134) 0:03:26.030 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:00:15 -0400 (0:00:00.084) 0:03:26.114 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:00:15 -0400 (0:00:00.327) 0:03:26.442 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:00:15 -0400 (0:00:00.194) 0:03:26.637 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:00:15 -0400 (0:00:00.093) 0:03:26.731 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:00:20 -0400 (0:00:04.213) 0:03:30.944 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:00:20 -0400 (0:00:00.178) 0:03:31.123 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:00:20 -0400 (0:00:00.131) 0:03:31.254 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:00:25 -0400 
(0:00:04.892) 0:03:36.147 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:00:25 -0400 (0:00:00.182) 0:03:36.329 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:00:25 -0400 (0:00:00.110) 0:03:36.440 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:00:25 -0400 (0:00:00.159) 0:03:36.599 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:00:25 -0400 (0:00:00.147) 0:03:36.747 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:00:29 -0400 (0:00:03.971) 0:03:40.718 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": 
"dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": 
"man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": 
"sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" 
}, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:00:32 -0400 (0:00:02.874) 0:03:43.593 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:00:33 -0400 (0:00:00.266) 0:03:43.859 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", 
"/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:00:38 -0400 (0:00:05.538) 0:03:49.398 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:00:38 -0400 (0:00:00.246) 0:03:49.645 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774133932.426608, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a6c980fca45342f22279697dc322fe6be468bc4a", "ctime": 1774133932.423608, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774133932.423608, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:00:40 -0400 (0:00:01.478) 0:03:51.123 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** 
task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:00:41 -0400 (0:00:01.093) 0:03:52.216 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:00:41 -0400 (0:00:00.186) 0:03:52.403 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:00:41 -0400 (0:00:00.154) 0:03:52.558 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:00:41 -0400 (0:00:00.256) 0:03:52.815 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", 
"_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:00:42 -0400 (0:00:00.159) 0:03:52.974 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-650cacf1-b078-496c-9d32-c0b5fdf3584c" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:00:43 -0400 (0:00:01.346) 0:03:54.320 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:00:45 -0400 (0:00:01.806) 0:03:56.127 ******** changed: [managed-node9] => (item={'src': 'UUID=852fa207-9067-4e60-9dbb-4d9de692146f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:00:46 -0400 (0:00:01.289) 0:03:57.416 ******** skipping: [managed-node9] => (item={'src': 'UUID=852fa207-9067-4e60-9dbb-4d9de692146f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 
'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:00:46 -0400 (0:00:00.259) 0:03:57.676 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:00:48 -0400 (0:00:01.867) 0:03:59.544 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774133946.7036896, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "33f43960139f68e88b9c68b3e2a656c958e26c87", "ctime": 1774133936.9996343, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 262144197, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1774133936.998634, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "2472704517", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:00:50 -0400 (0:00:01.648) 0:04:01.192 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-650cacf1-b078-496c-9d32-c0b5fdf3584c', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:00:51 -0400 (0:00:01.379) 0:04:02.571 ******** ok: [managed-node9] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:148 Saturday 21 March 2026 19:00:53 -0400 (0:00:01.974) 0:04:04.546 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:00:54 -0400 (0:00:00.489) 0:04:05.035 ******** 
skipping: [managed-node9] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:00:54 -0400 (0:00:00.330) 0:04:05.365 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:00:54 -0400 (0:00:00.304) 0:04:05.670 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "852fa207-9067-4e60-9dbb-4d9de692146f" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:00:56 -0400 (0:00:01.457) 0:04:07.127 ******** ok: [managed-node9] => { 
"changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002735", "end": "2026-03-21 19:00:57.430983", "rc": 0, "start": "2026-03-21 19:00:57.428248" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=852fa207-9067-4e60-9dbb-4d9de692146f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:00:57 -0400 (0:00:01.335) 0:04:08.463 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003087", "end": "2026-03-21 19:00:58.930770", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:00:58.927683" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:00:59 -0400 (0:00:01.514) 0:04:09.977 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:00:59 -0400 (0:00:00.167) 0:04:10.145 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:00:59 -0400 (0:00:00.317) 0:04:10.462 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:00:59 -0400 (0:00:00.248) 0:04:10.711 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:01:00 -0400 (0:00:00.996) 0:04:11.708 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:01:01 -0400 (0:00:00.232) 0:04:11.941 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:01:01 -0400 (0:00:00.296) 0:04:12.238 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:01:01 -0400 (0:00:00.355) 0:04:12.593 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:01:02 -0400 (0:00:00.275) 0:04:12.868 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:01:02 -0400 (0:00:00.264) 0:04:13.132 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:01:02 -0400 (0:00:00.331) 0:04:13.464 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:01:02 -0400 (0:00:00.225) 0:04:13.690 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:01:03 -0400 (0:00:00.201) 0:04:13.892 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:01:03 -0400 (0:00:00.246) 0:04:14.139 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:01:03 -0400 (0:00:00.219) 0:04:14.358 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:01:03 -0400 (0:00:00.090) 0:04:14.449 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=852fa207-9067-4e60-9dbb-4d9de692146f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:01:04 -0400 (0:00:00.435) 0:04:14.885 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:01:04 -0400 (0:00:00.290) 0:04:15.175 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:01:04 -0400 (0:00:00.282) 0:04:15.458 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:01:04 -0400 (0:00:00.243) 0:04:15.702 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:01:05 -0400 (0:00:00.318) 0:04:16.020 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:01:05 -0400 (0:00:00.197) 0:04:16.218 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:01:05 -0400 (0:00:00.338) 0:04:16.556 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:01:05 -0400 (0:00:00.213) 0:04:16.769 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134038.265213, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134038.265213, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1774134038.265213, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:01:07 -0400 (0:00:01.506) 0:04:18.276 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:01:07 -0400 (0:00:00.568) 0:04:18.845 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:01:08 -0400 (0:00:00.179) 0:04:19.024 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:01:08 -0400 (0:00:00.201) 0:04:19.225 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:01:08 -0400 (0:00:00.205) 0:04:19.430 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:01:08 -0400 (0:00:00.237) 0:04:19.668 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:01:09 -0400 (0:00:00.219) 0:04:19.888 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:01:09 -0400 (0:00:00.228) 0:04:20.117 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:01:13 -0400 (0:00:04.406) 0:04:24.523 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:01:13 -0400 (0:00:00.134) 0:04:24.658 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:01:14 -0400 (0:00:00.203) 0:04:24.861 ******** ok: 
[managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:01:14 -0400 (0:00:00.244) 0:04:25.105 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:01:14 -0400 (0:00:00.178) 0:04:25.284 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:01:14 -0400 (0:00:00.241) 0:04:25.526 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:01:14 -0400 (0:00:00.170) 0:04:25.696 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:01:14 -0400 (0:00:00.117) 0:04:25.813 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:01:15 -0400 (0:00:00.186) 0:04:25.999 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:01:15 -0400 (0:00:00.259) 0:04:26.259 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:01:15 -0400 (0:00:00.335) 0:04:26.594 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:01:15 -0400 (0:00:00.191) 0:04:26.786 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:01:16 -0400 (0:00:00.203) 0:04:26.989 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:01:16 -0400 (0:00:00.188) 0:04:27.178 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:01:16 -0400 (0:00:00.204) 0:04:27.383 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:01:16 -0400 (0:00:00.167) 0:04:27.550 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:01:16 -0400 (0:00:00.112) 0:04:27.663 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:01:17 -0400 (0:00:00.214) 0:04:27.877 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:01:17 -0400 (0:00:00.143) 0:04:28.021 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:01:17 -0400 (0:00:00.171) 0:04:28.192 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:01:17 -0400 (0:00:00.210) 0:04:28.403 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] 
****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:01:17 -0400 (0:00:00.211) 0:04:28.614 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:01:17 -0400 (0:00:00.128) 0:04:28.743 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:01:18 -0400 (0:00:00.232) 0:04:28.975 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:01:18 -0400 (0:00:00.198) 0:04:29.174 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:01:18 -0400 (0:00:00.201) 0:04:29.375 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:01:18 -0400 (0:00:00.199) 0:04:29.575 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:01:19 -0400 (0:00:00.291) 0:04:29.866 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:01:19 -0400 (0:00:00.162) 0:04:30.029 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:01:19 -0400 (0:00:00.257) 0:04:30.287 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:01:19 -0400 (0:00:00.259) 0:04:30.546 
******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:01:19 -0400 (0:00:00.203) 0:04:30.750 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:01:20 -0400 (0:00:00.317) 0:04:31.067 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:01:20 -0400 (0:00:00.278) 0:04:31.346 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:01:20 -0400 (0:00:00.302) 0:04:31.648 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:01:21 -0400 (0:00:00.198) 0:04:31.847 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:01:21 -0400 (0:00:00.264) 0:04:32.111 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:01:21 -0400 (0:00:00.182) 0:04:32.294 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:01:21 -0400 (0:00:00.198) 0:04:32.492 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:01:21 -0400 (0:00:00.164) 0:04:32.656 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 
Saturday 21 March 2026 19:01:22 -0400 (0:00:00.247) 0:04:32.904 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:01:22 -0400 (0:00:00.337) 0:04:33.242 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:01:22 -0400 (0:00:00.315) 0:04:33.557 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:01:22 -0400 (0:00:00.244) 0:04:33.802 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:01:23 -0400 (0:00:00.197) 0:04:33.999 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:01:23 -0400 (0:00:00.311) 0:04:34.310 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:01:23 -0400 (0:00:00.262) 0:04:34.573 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:01:23 -0400 (0:00:00.227) 0:04:34.800 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:01:24 -0400 (0:00:00.232) 0:04:35.033 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:01:24 -0400 (0:00:00.302) 0:04:35.336 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:01:24 -0400 (0:00:00.264) 0:04:35.600 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:01:24 -0400 (0:00:00.182) 0:04:35.783 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:01:25 -0400 (0:00:00.219) 0:04:36.003 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:01:25 -0400 (0:00:00.189) 0:04:36.192 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:01:25 -0400 (0:00:00.215) 0:04:36.408 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:01:25 -0400 (0:00:00.208) 0:04:36.617 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:01:25 -0400 (0:00:00.197) 0:04:36.814 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:01:26 -0400 (0:00:00.170) 0:04:36.985 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:01:26 -0400 (0:00:00.156) 0:04:37.141 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:01:26 -0400 (0:00:00.128) 0:04:37.270 ******** 
ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 21 March 2026 19:01:26 -0400 (0:00:00.074) 0:04:37.345 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:154 Saturday 21 March 2026 19:01:27 -0400 (0:00:01.350) 0:04:38.695 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 19:01:28 -0400 (0:00:00.401) 0:04:39.097 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 19:01:28 -0400 (0:00:00.265) 0:04:39.362 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:01:28 -0400 (0:00:00.286) 0:04:39.649 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:01:28 -0400 (0:00:00.154) 0:04:39.803 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:01:29 -0400 (0:00:00.225) 0:04:40.028 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:01:29 -0400 (0:00:00.145) 0:04:40.174 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:01:31 -0400 (0:00:02.067) 0:04:42.241 ******** skipping: 
[managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:01:31 -0400 (0:00:00.460) 0:04:42.702 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:01:32 -0400 (0:00:00.272) 0:04:42.974 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:01:32 -0400 (0:00:00.196) 0:04:43.171 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:01:32 -0400 (0:00:00.182) 0:04:43.354 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:01:32 -0400 (0:00:00.124) 0:04:43.478 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 
Saturday 21 March 2026 19:01:33 -0400 (0:00:00.395) 0:04:43.873 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:01:33 -0400 (0:00:00.252) 0:04:44.126 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:01:33 -0400 (0:00:00.243) 0:04:44.370 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:01:38 -0400 (0:00:04.528) 0:04:48.898 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:01:38 -0400 (0:00:00.254) 0:04:49.153 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:01:38 -0400 (0:00:00.406) 0:04:49.559 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:01:44 -0400 (0:00:05.763) 0:04:55.323 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:01:44 -0400 (0:00:00.300) 0:04:55.624 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:01:44 -0400 (0:00:00.157) 0:04:55.782 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:01:45 -0400 (0:00:00.158) 0:04:55.941 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:01:45 -0400 (0:00:00.149) 0:04:56.090 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:01:49 -0400 (0:00:04.302) 0:05:00.392 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": 
{ "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { 
"name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service": { "name": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service": { "name": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:01:52 -0400 (0:00:02.631) 0:05:03.024 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d650cacf1\x2db078\x2d496c\x2d9d32\x2dc0b5fdf3584c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "name": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket dev-sda.device cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override 
cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-650cacf1-b078-496c-9d32-c0b5fdf3584c", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-650cacf1-b078-496c-9d32-c0b5fdf3584c /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-650cacf1-b078-496c-9d32-c0b5fdf3584c ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", 
"LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:00:48 EDT", "StateChangeTimestampMonotonic": "1811365807", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...db078\x2d496c\x2d9d32\x2dc0b5fdf3584c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "name": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", 
"CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": 
"[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:01:55 -0400 (0:00:03.242) 0:05:06.267 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:02:00 -0400 (0:00:04.731) 0:05:10.999 ******** fatal: [managed-node9]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:02:00 -0400 (0:00:00.081) 0:05:11.081 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d650cacf1\x2db078\x2d496c\x2d9d32\x2dc0b5fdf3584c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "name": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": 
"0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service is masked.\"", "LoadState": 
"masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d650cacf1\\x2db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...db078\x2d496c\x2d9d32\x2dc0b5fdf3584c.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "name": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": 
"no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", 
"NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...db078\\x2d496c\\x2d9d32\\x2dc0b5fdf3584c.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:02:02 -0400 (0:00:01.987) 0:05:13.068 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:02:02 -0400 (0:00:00.047) 0:05:13.116 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Saturday 21 March 2026 19:02:02 -0400 (0:00:00.220) 0:05:13.336 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 21 March 2026 19:02:02 -0400 (0:00:00.070) 0:05:13.406 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134087.5374947, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", 
"checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774134087.5374947, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1774134087.5374947, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2338078971", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 21 March 2026 19:02:03 -0400 (0:00:00.775) 0:05:14.182 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:174 Saturday 21 March 2026 19:02:03 -0400 (0:00:00.099) 0:05:14.282 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:02:03 -0400 (0:00:00.248) 0:05:14.530 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:02:03 -0400 (0:00:00.114) 0:05:14.645 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:02:04 -0400 (0:00:00.230) 0:05:14.876 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:02:04 -0400 (0:00:00.138) 0:05:15.014 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:02:06 -0400 (0:00:01.838) 0:05:16.853 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:02:06 -0400 (0:00:00.352) 0:05:17.205 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:02:06 -0400 (0:00:00.329) 0:05:17.534 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:02:06 -0400 (0:00:00.172) 0:05:17.706 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:02:07 -0400 (0:00:00.143) 0:05:17.850 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:02:07 -0400 (0:00:00.121) 0:05:17.971 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:02:07 -0400 (0:00:00.266) 0:05:18.238 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:02:07 -0400 (0:00:00.234) 0:05:18.473 
******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:02:07 -0400 (0:00:00.182) 0:05:18.656 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:02:11 -0400 (0:00:03.603) 0:05:22.259 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:02:11 -0400 (0:00:00.185) 0:05:22.445 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:02:11 -0400 (0:00:00.094) 0:05:22.539 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:02:16 -0400 (0:00:04.853) 0:05:27.393 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:02:16 -0400 (0:00:00.195) 0:05:27.588 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:02:16 -0400 (0:00:00.103) 0:05:27.691 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:02:17 -0400 (0:00:00.245) 0:05:27.937 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:02:17 -0400 (0:00:00.105) 0:05:28.042 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] 
} MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:02:20 -0400 (0:00:03.653) 0:05:31.696 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": 
"dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:02:23 -0400 (0:00:02.692) 0:05:34.389 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:02:23 -0400 (0:00:00.202) 0:05:34.592 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:02:37 -0400 (0:00:13.361) 0:05:47.953 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:02:37 -0400 (0:00:00.218) 0:05:48.172 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134046.3902595, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a78774f3957a511029f4eb0cbb693bced77b1619", "ctime": 1774134046.3872595, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134046.3872595, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:02:38 -0400 (0:00:01.418) 0:05:49.591 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:02:40 -0400 (0:00:01.709) 0:05:51.300 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:02:40 -0400 (0:00:00.361) 0:05:51.661 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d" ], "mounts": [ { "fstype": 
"xfs", "path": "/opt/test1", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:02:41 -0400 (0:00:00.284) 0:05:51.946 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:02:41 -0400 (0:00:00.303) 0:05:52.249 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task 
path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:02:41 -0400 (0:00:00.207) 0:05:52.457 ******** changed: [managed-node9] => (item={'src': 'UUID=852fa207-9067-4e60-9dbb-4d9de692146f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=852fa207-9067-4e60-9dbb-4d9de692146f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:02:43 -0400 (0:00:01.681) 0:05:54.138 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:02:45 -0400 (0:00:02.028) 0:05:56.167 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:02:47 -0400 (0:00:01.709) 0:05:57.876 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:02:47 -0400 (0:00:00.195) 0:05:58.072 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:02:49 -0400 (0:00:01.970) 0:06:00.043 ******** ok: [managed-node9] => { "changed": false, 
"stat": { "atime": 1774134058.929331, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774134051.4842885, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 390070411, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1774134051.4832885, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1417030221", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:02:51 -0400 (0:00:01.874) 0:06:01.917 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:02:52 -0400 (0:00:01.530) 0:06:03.447 ******** ok: [managed-node9] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:186 Saturday 21 March 2026 19:02:54 -0400 (0:00:02.020) 0:06:05.467 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:02:55 -0400 (0:00:00.420) 0:06:05.888 ******** skipping: [managed-node9] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:02:55 -0400 (0:00:00.188) 0:06:06.077 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": 
null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:02:55 -0400 (0:00:00.397) 0:06:06.475 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "size": "10G", "type": "crypt", "uuid": "22b2a6eb-1704-442f-8a11-4254691f414d" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "05196f86-3e12-4b23-b22c-c1ad094a5b0d" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:02:57 -0400 (0:00:01.521) 0:06:07.996 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002522", "end": "2026-03-21 19:02:58.348828", "rc": 0, "start": "2026-03-21 19:02:58.346306" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:02:58 -0400 (0:00:01.442) 0:06:09.439 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002681", "end": "2026-03-21 19:02:59.821766", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:02:59.819085" } STDOUT: luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:03:00 -0400 (0:00:01.544) 0:06:10.983 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:03:00 -0400 (0:00:00.202) 0:06:11.186 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:03:00 -0400 (0:00:00.430) 0:06:11.616 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:03:00 -0400 (0:00:00.205) 0:06:11.821 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for 
managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:03:02 -0400 (0:00:01.024) 0:06:12.846 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:03:02 -0400 (0:00:00.301) 0:06:13.148 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:03:02 -0400 (0:00:00.234) 0:06:13.382 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:03:02 -0400 (0:00:00.396) 0:06:13.779 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:03:03 -0400 (0:00:00.236) 0:06:14.016 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:03:03 -0400 (0:00:00.287) 0:06:14.304 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:03:03 -0400 (0:00:00.235) 0:06:14.539 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:03:03 -0400 (0:00:00.263) 0:06:14.803 
******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:03:04 -0400 (0:00:00.216) 0:06:15.019 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:03:04 -0400 (0:00:00.263) 0:06:15.282 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:03:04 -0400 (0:00:00.256) 0:06:15.538 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:03:05 -0400 (0:00:00.388) 0:06:15.927 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:03:05 -0400 (0:00:00.497) 0:06:16.424 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:03:05 -0400 (0:00:00.291) 0:06:16.716 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:03:06 -0400 (0:00:00.239) 0:06:16.955 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:03:06 -0400 (0:00:00.210) 0:06:17.165 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK 
[Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:03:06 -0400 (0:00:00.237) 0:06:17.403 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:03:06 -0400 (0:00:00.244) 0:06:17.648 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:03:07 -0400 (0:00:00.344) 0:06:17.992 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:03:07 -0400 (0:00:00.336) 0:06:18.329 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134156.6858914, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134156.6858914, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1774134156.6858914, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:03:08 -0400 (0:00:01.368) 0:06:19.697 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:03:09 -0400 (0:00:00.200) 0:06:19.898 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:03:09 -0400 (0:00:00.164) 0:06:20.062 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] 
*************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:03:09 -0400 (0:00:00.307) 0:06:20.370 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:03:09 -0400 (0:00:00.256) 0:06:20.627 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:03:09 -0400 (0:00:00.159) 0:06:20.787 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:03:10 -0400 (0:00:00.176) 0:06:20.963 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134156.8358924, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134156.8358924, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 201770, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134156.8358924, "nlink": 1, "path": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:03:11 -0400 (0:00:01.472) 0:06:22.435 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:03:15 -0400 (0:00:03.753) 0:06:26.189 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.010122", "end": "2026-03-21 19:03:16.386799", "rc": 0, "start": "2026-03-21 19:03:16.376677" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 05196f86-3e12-4b23-b22c-c1ad094a5b0d Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 917562 Threads: 2 Salt: 36 72 40 5a ad f4 
8a ef ed 13 19 8b 34 f9 fd 5b 79 c7 f4 cd 74 04 44 12 c4 0c b8 5b 51 ef 2d 25 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120470 Salt: ef bd da 74 19 36 35 c4 94 6b f2 cb 91 34 72 eb 60 c3 75 07 2c a6 f6 53 4e d4 23 7a 9f 4c bf 09 Digest: 53 99 49 de b1 4b 4b af d1 d0 6f 34 8d 35 0b 24 c3 c6 0d 79 35 bf 24 a4 46 d4 66 19 2a 33 49 40 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:03:16 -0400 (0:00:01.298) 0:06:27.488 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:03:16 -0400 (0:00:00.232) 0:06:27.720 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:03:17 -0400 (0:00:00.151) 0:06:27.872 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:03:17 -0400 (0:00:00.281) 0:06:28.153 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:03:17 -0400 (0:00:00.187) 0:06:28.341 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:03:17 -0400 (0:00:00.184) 0:06:28.525 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:03:17 -0400 (0:00:00.273) 0:06:28.798 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:03:18 -0400 (0:00:00.241) 0:06:29.040 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for 
/etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:03:18 -0400 (0:00:00.316) 0:06:29.356 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:03:18 -0400 (0:00:00.213) 0:06:29.570 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:03:18 -0400 (0:00:00.263) 0:06:29.833 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:03:19 -0400 (0:00:00.257) 0:06:30.091 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:03:20 -0400 (0:00:00.951) 0:06:31.042 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:03:20 -0400 (0:00:00.186) 0:06:31.229 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:03:20 -0400 (0:00:00.206) 0:06:31.435 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:03:20 -0400 (0:00:00.243) 0:06:31.679 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:03:21 -0400 (0:00:00.196) 0:06:31.875 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:03:21 -0400 (0:00:00.246) 0:06:32.122 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:03:21 -0400 (0:00:00.131) 0:06:32.254 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:03:21 -0400 (0:00:00.278) 0:06:32.532 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:03:21 -0400 (0:00:00.239) 0:06:32.772 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:03:22 -0400 (0:00:00.238) 0:06:33.010 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:03:22 -0400 (0:00:00.194) 0:06:33.204 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:03:22 -0400 (0:00:00.162) 0:06:33.367 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:03:22 -0400 (0:00:00.203) 0:06:33.571 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:03:23 -0400 (0:00:00.306) 0:06:33.877 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:03:23 -0400 (0:00:00.220) 0:06:34.098 ******** ok: 
[managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:03:23 -0400 (0:00:00.181) 0:06:34.279 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:03:23 -0400 (0:00:00.243) 0:06:34.523 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:03:23 -0400 (0:00:00.268) 0:06:34.791 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:03:24 -0400 (0:00:00.289) 0:06:35.081 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:03:24 -0400 (0:00:00.295) 0:06:35.376 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:03:24 -0400 (0:00:00.273) 0:06:35.649 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:03:25 -0400 (0:00:00.265) 0:06:35.915 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:03:25 -0400 (0:00:00.257) 0:06:36.172 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:03:25 -0400 (0:00:00.280) 0:06:36.452 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:03:25 -0400 (0:00:00.325) 
0:06:36.778 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:03:26 -0400 (0:00:00.246) 0:06:37.025 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:03:26 -0400 (0:00:00.392) 0:06:37.417 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:03:26 -0400 (0:00:00.238) 0:06:37.655 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:03:27 -0400 (0:00:00.373) 0:06:38.029 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:03:27 -0400 (0:00:00.273) 0:06:38.302 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:03:27 -0400 (0:00:00.264) 0:06:38.567 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:03:27 -0400 (0:00:00.213) 0:06:38.781 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:03:28 -0400 (0:00:00.172) 0:06:38.953 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:03:28 -0400 (0:00:00.261) 0:06:39.215 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:03:28 -0400 
(0:00:00.212) 0:06:39.427 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:03:28 -0400 (0:00:00.296) 0:06:39.723 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:03:29 -0400 (0:00:00.125) 0:06:39.848 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:03:29 -0400 (0:00:00.166) 0:06:40.015 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:03:29 -0400 (0:00:00.234) 0:06:40.249 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:03:29 -0400 (0:00:00.165) 0:06:40.415 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:03:29 -0400 (0:00:00.191) 0:06:40.607 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:03:30 -0400 (0:00:00.280) 0:06:40.887 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:03:30 -0400 (0:00:00.276) 0:06:41.164 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:03:30 -0400 (0:00:00.205) 0:06:41.369 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:03:30 -0400 (0:00:00.170) 0:06:41.540 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:03:30 -0400 (0:00:00.233) 0:06:41.774 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:03:31 -0400 (0:00:00.145) 0:06:41.920 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:193 Saturday 21 March 2026 19:03:31 -0400 (0:00:00.232) 0:06:42.152 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 19:03:31 -0400 (0:00:00.520) 0:06:42.673 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 19:03:32 -0400 (0:00:00.284) 0:06:42.957 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:03:32 -0400 (0:00:00.235) 0:06:43.193 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:03:32 -0400 (0:00:00.211) 0:06:43.404 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:03:32 -0400 (0:00:00.258) 0:06:43.663 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK 
[fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:03:32 -0400 (0:00:00.155) 0:06:43.818 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:03:34 -0400 (0:00:01.951) 0:06:45.770 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:03:35 -0400 (0:00:00.450) 0:06:46.221 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:03:35 -0400 (0:00:00.151) 0:06:46.372 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:03:35 -0400 (0:00:00.190) 0:06:46.562 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:03:35 -0400 (0:00:00.135) 0:06:46.698 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK 
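
The CentOS_8.yml vars file shown above deliberately leaves one list element as an unrendered Jinja expression, which picks the architecture-specific libblockdev package at runtime. A minimal standalone playbook demonstrating how that expression evaluates (illustrative only; on x86_64 it renders to libblockdev, on s390x to libblockdev-s390):

    # Minimal sketch of the conditional-package idiom from the vars file above.
    - hosts: localhost
      gather_facts: true
      tasks:
        - name: Show which libblockdev variant this host would get
          debug:
            msg: "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"
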
[fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:03:35 -0400 (0:00:00.116) 0:06:46.814 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:03:36 -0400 (0:00:00.368) 0:06:47.183 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:03:36 -0400 (0:00:00.175) 0:06:47.358 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:03:36 -0400 (0:00:00.173) 0:06:47.532 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:03:40 -0400 (0:00:04.238) 0:06:51.771 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:03:41 -0400 (0:00:00.259) 0:06:52.030 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:03:41 -0400 (0:00:00.207) 0:06:52.238 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:03:46 -0400 (0:00:05.074) 0:06:57.312 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 
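
The "Show storage_pools" output above is the input for this negative test: an encrypted partition volume with no key or password, which is also why the "Get required packages" step reports cryptsetup as a required package. Reconstructed as a playbook-style invocation (the play wrapper is assumed; the pool and volume values are copied from the log):

    # Illustrative invocation matching the storage_pools value shown above.
    - hosts: managed-node9
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    mount_point: /opt/test1
                    encryption: true   # no encryption_password or encryption_key supplied
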
21 March 2026 19:03:46 -0400 (0:00:00.303) 0:06:57.616 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:03:46 -0400 (0:00:00.126) 0:06:57.742 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:03:47 -0400 (0:00:00.198) 0:06:57.941 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:03:47 -0400 (0:00:00.130) 0:06:58.072 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:03:51 -0400 (0:00:04.188) 0:07:02.261 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": 
"mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:03:54 -0400 (0:00:03.099) 0:07:05.361 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:03:54 -0400 (0:00:00.358) 0:07:05.719 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:04:00 -0400 (0:00:05.372) 0:07:11.092 ******** fatal: [managed-node9]: FAILED! 
=> { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:04:00 -0400 (0:00:00.351) 0:07:11.444 ******** TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:04:01 -0400 (0:00:00.428) 0:07:11.873 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:04:01 -0400 (0:00:00.171) 0:07:12.044 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Saturday 21 March 2026 19:04:01 -0400 (0:00:00.234) 0:07:12.279 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:212 Saturday 21 March 2026 19:04:01 -0400 (0:00:00.134) 0:07:12.414 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:04:01 -0400 (0:00:00.319) 0:07:12.734 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:04:02 -0400 (0:00:00.157) 0:07:12.891 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:04:02 -0400 (0:00:00.097) 0:07:12.989 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:04:02 -0400 (0:00:00.144) 0:07:13.133 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:04:04 -0400 (0:00:02.159) 0:07:15.293 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", 
"libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:04:04 -0400 (0:00:00.306) 0:07:15.599 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:04:04 -0400 (0:00:00.213) 0:07:15.813 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:04:05 -0400 (0:00:00.236) 0:07:16.049 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:04:05 -0400 (0:00:00.113) 0:07:16.163 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:04:05 -0400 (0:00:00.174) 0:07:16.337 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:04:05 -0400 (0:00:00.448) 0:07:16.786 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:04:06 -0400 (0:00:00.229) 0:07:17.016 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:04:06 -0400 (0:00:00.311) 0:07:17.328 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:04:10 -0400 (0:00:04.098) 0:07:21.426 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:04:10 -0400 (0:00:00.169) 0:07:21.596 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:04:10 -0400 (0:00:00.231) 0:07:21.827 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:04:16 -0400 (0:00:05.221) 0:07:27.049 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:04:16 -0400 (0:00:00.290) 0:07:27.339 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:04:16 -0400 (0:00:00.119) 0:07:27.459 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:04:16 -0400 (0:00:00.150) 0:07:27.610 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:04:16 -0400 (0:00:00.135) 0:07:27.745 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: 
Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:04:21 -0400 (0:00:04.301) 0:07:32.046 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": 
"dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": 
"initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:04:24 -0400 (0:00:02.912) 0:07:34.959 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:04:24 -0400 (0:00:00.279) 0:07:35.238 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-c9479194-c226-4bd0-b788-f506f21ba738", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:04:38 -0400 (0:00:14.228) 0:07:49.467 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:04:38 -0400 (0:00:00.215) 0:07:49.682 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134166.7219493, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2ddcf60b34f0eada8598fe27bb989041e703e4ff", "ctime": 1774134166.7179494, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134166.7179494, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:04:40 -0400 (0:00:01.381) 0:07:51.064 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:04:41 -0400 (0:00:01.424) 0:07:52.488 ******** TASK 
[fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:04:42 -0400 (0:00:00.375) 0:07:52.863 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-c9479194-c226-4bd0-b788-f506f21ba738", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", 
"state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:04:42 -0400 (0:00:00.177) 0:07:53.041 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:04:42 -0400 (0:00:00.134) 0:07:53.176 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:04:42 -0400 (0:00:00.280) 0:07:53.456 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:04:44 -0400 (0:00:01.565) 0:07:55.021 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:04:46 -0400 (0:00:01.928) 0:07:56.950 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:04:47 -0400 (0:00:01.391) 0:07:58.342 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:04:47 -0400 (0:00:00.317) 0:07:58.660 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:04:49 -0400 (0:00:01.958) 0:08:00.618 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134179.8210251, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a65458bd3589ae0f9a9c0d1e1c0d3922539701e8", "ctime": 1774134172.3869822, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 883897, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1774134172.385982, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1169795774", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, 
"xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:04:51 -0400 (0:00:01.379) 0:08:01.998 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-c9479194-c226-4bd0-b788-f506f21ba738', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-c9479194-c226-4bd0-b788-f506f21ba738", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:04:53 -0400 (0:00:02.714) 0:08:04.712 ******** ok: [managed-node9] TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:228 Saturday 21 March 2026 19:04:55 -0400 (0:00:01.879) 0:08:06.591 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:04:56 -0400 (0:00:00.461) 0:08:07.053 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:04:56 -0400 (0:00:00.226) 0:08:07.279 ******** skipping: [managed-node9] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:04:56 -0400 (0:00:00.055) 0:08:07.335 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "size": "4G", "type": "crypt", "uuid": "a8016b94-4809-4010-a05c-4864e3de34fd" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "c9479194-c226-4bd0-b788-f506f21ba738" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:04:57 -0400 (0:00:01.030) 0:08:08.365 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002658", "end": "2026-03-21 19:04:58.648857", "rc": 0, "start": "2026-03-21 19:04:58.646199" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. 
# # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:04:58 -0400 (0:00:01.341) 0:08:09.707 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003024", "end": "2026-03-21 19:04:59.844234", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:04:59.841210" } STDOUT: luks-c9479194-c226-4bd0-b788-f506f21ba738 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:05:00 -0400 (0:00:01.286) 0:08:10.993 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 21 March 2026 19:05:00 -0400 (0:00:00.448) 0:08:11.441 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 21 March 2026 19:05:00 -0400 (0:00:00.202) 0:08:11.643 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 21 March 2026 19:05:01 -0400 (0:00:00.243) 0:08:11.887 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 21 March 2026 19:05:01 -0400 (0:00:00.935) 0:08:12.822 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 21 March 2026 19:05:02 -0400 (0:00:00.616) 0:08:13.439 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 21 March 2026 19:05:02 -0400 (0:00:00.281) 0:08:13.720 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 21 March 2026 19:05:03 -0400 (0:00:00.316) 0:08:14.037 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 21 March 2026 19:05:03 -0400 (0:00:00.294) 0:08:14.331 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 21 March 2026 19:05:03 -0400 (0:00:00.337) 0:08:14.669 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 21 March 2026 19:05:04 -0400 (0:00:00.215) 0:08:14.884 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 21 March 2026 19:05:04 -0400 (0:00:00.186) 0:08:15.070 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 21 March 2026 19:05:04 -0400 (0:00:00.287) 0:08:15.358 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 21 March 2026 19:05:04 -0400 (0:00:00.356) 0:08:15.714 ******** skipping: 
[managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 21 March 2026 19:05:05 -0400 (0:00:00.222) 0:08:15.937 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.97 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 21 March 2026 19:05:06 -0400 (0:00:01.708) 0:08:17.646 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 21 March 2026 19:05:07 -0400 (0:00:00.231) 0:08:17.877 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 21 March 2026 19:05:07 -0400 (0:00:00.517) 0:08:18.394 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 21 March 2026 19:05:07 -0400 (0:00:00.220) 0:08:18.615 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 21 March 2026 19:05:08 -0400 (0:00:00.341) 0:08:18.956 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 21 March 2026 19:05:08 -0400 (0:00:00.323) 0:08:19.280 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 21 March 2026 19:05:08 -0400 (0:00:00.359) 0:08:19.639 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 21 March 2026 19:05:09 -0400 (0:00:00.353) 0:08:19.993 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 21 March 2026 19:05:09 -0400 (0:00:00.173) 0:08:20.167 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 21 March 2026 19:05:09 -0400 (0:00:00.213) 0:08:20.380 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 21 March 2026 19:05:09 -0400 (0:00:00.306) 0:08:20.687 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 21 March 2026 19:05:10 -0400 (0:00:00.321) 0:08:21.008 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 21 March 2026 19:05:10 -0400 (0:00:00.294) 0:08:21.303 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 21 March 2026 19:05:10 -0400 (0:00:00.181) 0:08:21.484 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 21 March 2026 19:05:11 -0400 (0:00:00.489) 0:08:21.974 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 
'raid_metadata_version': None, '_device': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 21 March 2026 19:05:11 -0400 (0:00:00.220) 0:08:22.194 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 21 March 2026 19:05:11 -0400 (0:00:00.514) 0:08:22.709 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { 
"ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 21 March 2026 19:05:12 -0400 (0:00:00.370) 0:08:23.079 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 21 March 2026 19:05:12 -0400 (0:00:00.516) 0:08:23.595 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 21 March 2026 19:05:13 -0400 (0:00:00.300) 0:08:23.896 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 21 March 2026 19:05:13 -0400 (0:00:00.260) 0:08:24.157 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 21 March 2026 19:05:13 -0400 (0:00:00.239) 0:08:24.397 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 21 March 2026 19:05:13 -0400 (0:00:00.132) 0:08:24.530 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 21 March 2026 19:05:14 -0400 (0:00:00.347) 0:08:24.878 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 21 March 2026 19:05:14 -0400 (0:00:00.391) 0:08:25.269 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 21 March 2026 19:05:14 -0400 (0:00:00.518) 0:08:25.788 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 21 March 2026 19:05:15 -0400 (0:00:00.303) 0:08:26.092 ******** skipping: [managed-node9] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 21 March 2026 19:05:15 -0400 (0:00:00.306) 0:08:26.398 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 21 March 2026 19:05:15 -0400 (0:00:00.333) 0:08:26.732 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 21 March 2026 19:05:16 -0400 (0:00:00.314) 0:08:27.046 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 21 March 2026 19:05:16 -0400 (0:00:00.235) 0:08:27.282 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 21 March 2026 19:05:16 -0400 (0:00:00.254) 0:08:27.536 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 21 March 2026 19:05:16 -0400 (0:00:00.166) 0:08:27.703 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 21 March 2026 19:05:17 -0400 (0:00:00.163) 0:08:27.867 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml 
for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:05:17 -0400 (0:00:00.327) 0:08:28.194 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:05:17 -0400 (0:00:00.196) 0:08:28.391 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:05:18 -0400 (0:00:01.196) 0:08:29.587 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:05:19 -0400 (0:00:00.301) 0:08:29.888 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:05:19 -0400 (0:00:00.261) 0:08:30.150 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:05:19 -0400 (0:00:00.293) 0:08:30.444 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:05:19 -0400 (0:00:00.225) 0:08:30.670 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:05:19 -0400 (0:00:00.169) 0:08:30.839 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:05:20 -0400 (0:00:00.215) 0:08:31.055 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:05:20 -0400 (0:00:00.177) 0:08:31.232 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:05:20 -0400 (0:00:00.185) 0:08:31.417 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:05:20 -0400 (0:00:00.275) 0:08:31.693 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:05:21 -0400 (0:00:00.187) 0:08:31.881 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:05:21 -0400 (0:00:00.159) 0:08:32.041 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:05:21 -0400 (0:00:00.503) 0:08:32.544 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:05:22 -0400 (0:00:00.368) 0:08:32.913 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:05:22 -0400 (0:00:00.338) 0:08:33.251 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:05:22 -0400 (0:00:00.299) 0:08:33.550 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:05:23 -0400 (0:00:00.324) 0:08:33.875 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:05:23 -0400 (0:00:00.266) 0:08:34.141 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:05:23 -0400 (0:00:00.548) 0:08:34.690 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:05:24 -0400 (0:00:00.367) 0:08:35.058 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134278.2105935, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134278.2105935, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 214896, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1774134278.2105935, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, 
"rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:05:26 -0400 (0:00:01.954) 0:08:37.012 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:05:26 -0400 (0:00:00.289) 0:08:37.302 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:05:26 -0400 (0:00:00.270) 0:08:37.572 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:05:26 -0400 (0:00:00.223) 0:08:37.795 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:05:27 -0400 (0:00:00.258) 0:08:38.054 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:05:27 -0400 (0:00:00.259) 0:08:38.313 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:05:27 -0400 (0:00:00.232) 0:08:38.545 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134278.3575943, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134278.3575943, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 216176, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134278.3575943, "nlink": 1, "path": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] 
******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:05:29 -0400 (0:00:01.789) 0:08:40.335 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:05:33 -0400 (0:00:04.205) 0:08:44.541 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010442", "end": "2026-03-21 19:05:34.775381", "rc": 0, "start": "2026-03-21 19:05:34.764939" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: c9479194-c226-4bd0-b788-f506f21ba738 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 914285 Threads: 2 Salt: c7 2a 54 93 5b b9 5d d8 4e 4d 63 6e dc 05 d8 da d4 af 16 f5 51 73 cf 83 be 0d 8b 52 2b c8 36 3c AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120249 Salt: c3 8e a7 92 42 8e 6b d7 1a 05 57 7c d1 7c 17 7d 21 0f 20 68 8a 5e a7 24 1f 47 68 77 74 ea 63 10 Digest: 42 95 c5 0b d3 3c a8 95 7a df cf 76 fa bd bf 9f cc 62 be 9b 31 57 18 ac a8 d6 33 d2 95 58 87 c3 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:05:34 -0400 (0:00:01.294) 0:08:45.835 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:05:35 -0400 (0:00:00.238) 0:08:46.074 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:05:35 -0400 (0:00:00.359) 0:08:46.434 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:05:35 -0400 (0:00:00.293) 0:08:46.727 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:05:36 -0400 (0:00:00.346) 0:08:47.074 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:05:36 -0400 (0:00:00.269) 0:08:47.343 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:05:36 -0400 (0:00:00.283) 0:08:47.627 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:05:37 -0400 (0:00:00.361) 0:08:47.988 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-c9479194-c226-4bd0-b788-f506f21ba738 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:05:37 -0400 (0:00:00.517) 0:08:48.506 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:05:37 -0400 (0:00:00.321) 0:08:48.828 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:05:38 -0400 (0:00:00.352) 0:08:49.180 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:05:38 -0400 (0:00:00.262) 0:08:49.443 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:05:38 -0400 (0:00:00.269) 0:08:49.712 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:05:39 -0400 (0:00:00.209) 0:08:49.921 ******** skipping: [managed-node9] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:05:39 -0400 (0:00:00.222) 0:08:50.144 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:05:39 -0400 (0:00:00.160) 0:08:50.305 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:05:39 -0400 (0:00:00.214) 0:08:50.519 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:05:39 -0400 (0:00:00.226) 0:08:50.745 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:05:40 -0400 (0:00:00.232) 0:08:50.977 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:05:40 -0400 (0:00:00.225) 0:08:51.203 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:05:40 -0400 (0:00:00.237) 0:08:51.441 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:05:40 -0400 (0:00:00.262) 0:08:51.703 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:05:41 -0400 (0:00:00.273) 0:08:51.976 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:05:41 -0400 (0:00:00.249) 0:08:52.226 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:05:41 -0400 (0:00:00.187) 0:08:52.413 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:05:41 -0400 (0:00:00.251) 0:08:52.665 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:05:42 -0400 (0:00:00.225) 0:08:52.890 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:05:42 -0400 (0:00:00.180) 0:08:53.071 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:05:42 -0400 (0:00:00.191) 0:08:53.262 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:05:42 -0400 (0:00:00.198) 0:08:53.461 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:05:42 -0400 (0:00:00.330) 0:08:53.791 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:05:43 -0400 (0:00:00.219) 0:08:54.011 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:05:43 -0400 (0:00:00.286) 0:08:54.297 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] 
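(The size-verification tasks in this stretch are all skipped for this volume, presumably because they target LVM volumes sized by percentage rather than a plain partition requested at '4g', so the log never shows the expected-size arithmetic. For a feel for that comparison, a hedged sketch using Ansible's human_to_bytes filter follows; the measured byte count and the 1% tolerance are invented for illustration, and only the '4g' figure comes from the log.)

# Illustrative only; not one of the skipped tasks above.
- hosts: managed-node9
  gather_facts: false
  tasks:
    - name: Assert the measured size is within 1% of the requested size
      vars:
        requested_size_bytes: "{{ '4g' | human_to_bytes }}"   # '4g' -> 4294967296 bytes
        actual_size_bytes: 4290772992                         # hypothetical measured size
      ansible.builtin.assert:
        that:
          - "(actual_size_bytes - (requested_size_bytes | int)) | abs <= (requested_size_bytes | int) * 0.01"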
*************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:05:43 -0400 (0:00:00.270) 0:08:54.567 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:05:44 -0400 (0:00:00.302) 0:08:54.870 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:05:44 -0400 (0:00:00.277) 0:08:55.148 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:05:44 -0400 (0:00:00.274) 0:08:55.422 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:05:44 -0400 (0:00:00.287) 0:08:55.710 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:05:45 -0400 (0:00:00.274) 0:08:55.984 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:05:45 -0400 (0:00:00.248) 0:08:56.232 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:05:45 -0400 (0:00:00.262) 0:08:56.495 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:05:45 -0400 (0:00:00.261) 0:08:56.757 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:05:46 -0400 (0:00:00.297) 0:08:57.055 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and 
percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:05:46 -0400 (0:00:00.250) 0:08:57.305 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:05:46 -0400 (0:00:00.172) 0:08:57.478 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:05:46 -0400 (0:00:00.231) 0:08:57.710 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:05:47 -0400 (0:00:00.257) 0:08:57.967 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:05:47 -0400 (0:00:00.279) 0:08:58.246 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:05:47 -0400 (0:00:00.319) 0:08:58.566 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:05:47 -0400 (0:00:00.265) 0:08:58.832 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:05:48 -0400 (0:00:00.270) 0:08:59.102 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:05:48 -0400 (0:00:00.282) 0:08:59.384 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:05:48 -0400 (0:00:00.302) 0:08:59.686 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:05:49 -0400 (0:00:00.272) 0:08:59.959 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:05:49 -0400 (0:00:00.299) 0:09:00.259 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:05:49 -0400 (0:00:00.287) 0:09:00.546 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:05:49 -0400 (0:00:00.253) 0:09:00.800 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:05:50 -0400 (0:00:00.262) 0:09:01.063 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:05:50 -0400 (0:00:00.182) 0:09:01.245 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:05:50 -0400 (0:00:00.239) 0:09:01.484 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 21 March 2026 19:05:50 -0400 (0:00:00.189) 0:09:01.674 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:234 Saturday 
21 March 2026 19:05:52 -0400 (0:00:01.420) 0:09:03.094 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 19:05:52 -0400 (0:00:00.431) 0:09:03.526 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 19:05:53 -0400 (0:00:00.419) 0:09:03.945 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:05:53 -0400 (0:00:00.335) 0:09:04.280 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:05:53 -0400 (0:00:00.190) 0:09:04.471 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:05:53 -0400 (0:00:00.195) 0:09:04.667 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:05:53 -0400 (0:00:00.127) 0:09:04.794 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:05:56 -0400 (0:00:02.121) 0:09:06.916 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:05:56 -0400 (0:00:00.460) 0:09:07.377 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:05:56 -0400 (0:00:00.266) 0:09:07.643 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:05:57 -0400 (0:00:00.278) 0:09:07.922 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:05:57 -0400 (0:00:00.274) 0:09:08.196 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:05:57 -0400 (0:00:00.156) 0:09:08.353 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:05:57 -0400 (0:00:00.480) 0:09:08.834 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:05:58 -0400 (0:00:00.196) 0:09:09.030 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:05:58 -0400 (0:00:00.141) 0:09:09.172 ******** ok: [managed-node9] => { "changed": false, "rc": 
0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:06:02 -0400 (0:00:04.164) 0:09:13.336 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:06:02 -0400 (0:00:00.182) 0:09:13.519 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:06:02 -0400 (0:00:00.263) 0:09:13.782 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:06:08 -0400 (0:00:05.424) 0:09:19.207 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:06:08 -0400 (0:00:00.295) 0:09:19.503 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:06:08 -0400 (0:00:00.147) 0:09:19.650 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:06:09 -0400 (0:00:00.204) 0:09:19.855 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:06:09 -0400 (0:00:00.142) 0:09:19.998 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:06:13 -0400 (0:00:03.969) 0:09:23.968 ******** ok: [managed-node9] => { "ansible_facts": { 
"services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, 
"debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { 
"name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { 
"name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service": { "name": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service": { "name": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": 
"systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:06:15 -0400 (0:00:02.484) 0:09:26.453 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d05196f86\x2d3e12\x2d4b23\x2db22c\x2dc1ad094a5b0d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "name": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-sda.device cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for 
luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-05196f86-3e12-4b23-b22c-c1ad094a5b0d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", 
"PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:04:49 EDT", "StateChangeTimestampMonotonic": "2052450744", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d3e12\x2d4b23\x2db22c\x2dc1ad094a5b0d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "name": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap 
cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", 
"ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:06:18 -0400 (0:00:03.114) 0:09:29.567 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-c9479194-c226-4bd0-b788-f506f21ba738' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:06:24 -0400 (0:00:05.344) 0:09:34.912 ******** fatal: [managed-node9]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-c9479194-c226-4bd0-b788-f506f21ba738' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:06:24 -0400 (0:00:00.263) 0:09:35.176 ******** changed: [managed-node9] => 
(item=systemd-cryptsetup@luks\x2d05196f86\x2d3e12\x2d4b23\x2db22c\x2dc1ad094a5b0d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "name": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d05196f86\\x2d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d3e12\x2d4b23\x2db22c\x2dc1ad094a5b0d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "name": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d3e12\\x2d4b23\\x2db22c\\x2dc1ad094a5b0d.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:06:26 -0400 (0:00:02.424) 0:09:37.600 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:06:26 -0400 (0:00:00.042) 0:09:37.643 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Saturday 21 March 2026 19:06:26 -0400 (0:00:00.067) 0:09:37.711 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 21 March 2026 19:06:26 -0400 (0:00:00.089) 0:09:37.801 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134352.0290198, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774134352.0290198, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1774134352.0290198, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2083767786", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 21 March 2026 19:06:27 -0400 (0:00:00.623) 0:09:38.424 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:258 Saturday 21 March 2026 19:06:27 -0400 (0:00:00.080) 0:09:38.505 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:06:27 -0400 (0:00:00.232) 0:09:38.737 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:06:27 -0400 (0:00:00.098) 0:09:38.835 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:06:28 -0400 (0:00:00.072) 0:09:38.908 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:06:28 -0400 (0:00:00.048) 0:09:38.957 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:06:29 -0400 (0:00:01.372) 0:09:40.329 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:06:29 -0400 (0:00:00.385) 0:09:40.715 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:06:30 -0400 (0:00:00.222) 0:09:40.937 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:06:30 -0400 (0:00:00.149) 0:09:41.087 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:06:30 -0400 (0:00:00.187) 0:09:41.274 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:06:30 -0400 (0:00:00.113) 0:09:41.387 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK 
[fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:06:30 -0400 (0:00:00.316) 0:09:41.703 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:06:31 -0400 (0:00:00.462) 0:09:42.168 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:06:31 -0400 (0:00:00.158) 0:09:42.326 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:06:35 -0400 (0:00:04.315) 0:09:46.642 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:06:35 -0400 (0:00:00.148) 0:09:46.791 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:06:36 -0400 (0:00:00.131) 0:09:46.922 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:06:41 -0400 (0:00:05.460) 0:09:52.383 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:06:41 -0400 (0:00:00.161) 0:09:52.545 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:06:41 -0400 (0:00:00.066) 0:09:52.611 ******** skipping: [managed-node9] 
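Note: the storage_pools value dumped above describes the target state for this second attempt: the same partition pool foo on sda with volume test1, now with encryption: false so the existing LUKS layer is removed. The earlier attempt failed because the role's safe mode (storage_safe_mode, default true) refuses to destroy existing formatting; a re-run like this one presumably sets it to false. As a rough, hedged sketch (not the test's own task file, and using only values visible in this log), an equivalent standalone invocation could look like:

- hosts: managed-node9
  tasks:
    - name: Remove the encryption layer from test1 (illustrative sketch only)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # Safe mode (default true) blocks destructive re-formatting such as
        # stripping LUKS from an existing device; disable it deliberately here.
        storage_safe_mode: false
        storage_pools:
          - name: foo
            type: partition
            disks: [sda]
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: false                   # drop the LUKS layer
                encryption_password: yabbadabbadoo  # carried over unchanged from the pool spec shown above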
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:06:41 -0400 (0:00:00.086) 0:09:52.697 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:06:41 -0400 (0:00:00.085) 0:09:52.782 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:06:45 -0400 (0:00:03.755) 0:09:56.538 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, 
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": 
{ "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service": { "name": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service": { "name": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:06:48 -0400 (0:00:02.848) 0:09:59.387 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2dc9479194\x2dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", 
"CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-c9479194-c226-4bd0-b788-f506f21ba738", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c9479194-c226-4bd0-b788-f506f21ba738 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c9479194-c226-4bd0-b788-f506f21ba738 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": 
"infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:06:18 EDT", "StateChangeTimestampMonotonic": "2141507930", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": 
"no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit 
systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:06:52 -0400 (0:00:03.624) 0:10:03.011 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-c9479194-c226-4bd0-b788-f506f21ba738", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", 
"/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:06:58 -0400 (0:00:05.867) 0:10:08.878 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:06:58 -0400 (0:00:00.292) 0:10:09.171 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134287.2386456, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9df556205051593b1b7ae34a6eadcf8f2ef7da0a", "ctime": 1774134287.2346456, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134287.2346456, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "133446622", "wgrp": false, "woth": 
false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:06:59 -0400 (0:00:01.484) 0:10:10.655 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:07:01 -0400 (0:00:01.516) 0:10:12.172 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2dc9479194\x2dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", 
"IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:06:18 EDT", "StateChangeTimestampMonotonic": "2141507930", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", 
"SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", 
"KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": 
"0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:07:04 -0400 (0:00:03.518) 0:10:15.690 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-c9479194-c226-4bd0-b788-f506f21ba738", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:07:05 -0400 (0:00:00.229) 0:10:15.920 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:07:05 -0400 (0:00:00.346) 0:10:16.267 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:07:05 -0400 (0:00:00.248) 0:10:16.515 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-c9479194-c226-4bd0-b788-f506f21ba738" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:07:07 -0400 (0:00:01.430) 0:10:17.945 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:07:08 -0400 (0:00:01.817) 0:10:19.763 ******** changed: [managed-node9] => (item={'src': 'UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3', 'path': '/opt/test1', 'fstype': 
'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:07:10 -0400 (0:00:01.422) 0:10:21.186 ******** skipping: [managed-node9] => (item={'src': 'UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:07:10 -0400 (0:00:00.333) 0:10:21.520 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:07:12 -0400 (0:00:01.740) 0:10:23.261 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134299.8427184, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "87272f66799c167be5a330dc89a92e7b129aed61", "ctime": 1774134293.6896827, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 123732165, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1774134293.6886828, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "2180375997", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:07:14 -0400 (0:00:01.772) 0:10:25.033 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-c9479194-c226-4bd0-b788-f506f21ba738', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-c9479194-c226-4bd0-b788-f506f21ba738", "password": "-", "state": "absent" }, "found": 1 } 
MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:07:15 -0400 (0:00:01.742) 0:10:26.775 ******** ok: [managed-node9] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:274 Saturday 21 March 2026 19:07:18 -0400 (0:00:02.558) 0:10:29.333 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:07:19 -0400 (0:00:01.249) 0:10:30.582 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:07:20 -0400 (0:00:00.297) 0:10:30.879 ******** skipping: [managed-node9] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:07:20 -0400 (0:00:00.210) 0:10:31.090 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "59c103d4-0981-45a8-bb35-e49fb4f41de3" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:07:21 -0400 (0:00:01.315) 0:10:32.406 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002653", "end": "2026-03-21 19:07:22.630233", "rc": 0, "start": "2026-03-21 19:07:22.627580" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:07:23 -0400 (0:00:01.450) 0:10:33.857 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002800", "end": "2026-03-21 19:07:24.118091", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:07:24.115291" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:07:24 -0400 (0:00:01.342) 0:10:35.199 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 21 March 2026 19:07:24 -0400 (0:00:00.392) 0:10:35.591 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 21 March 2026 19:07:24 -0400 (0:00:00.240) 0:10:35.832 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 21 March 2026 19:07:25 -0400 (0:00:00.198) 0:10:36.030 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 21 March 2026 19:07:25 -0400 (0:00:00.176) 0:10:36.206 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 21 March 2026 19:07:25 -0400 (0:00:00.370) 0:10:36.576 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 21 March 2026 19:07:25 -0400 (0:00:00.220) 0:10:36.797 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 21 March 2026 19:07:26 -0400 (0:00:00.206) 0:10:37.004 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 21 March 2026 19:07:26 -0400 (0:00:00.232) 0:10:37.236 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 21 March 2026 19:07:26 -0400 (0:00:00.160) 0:10:37.396 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 21 March 2026 19:07:26 -0400 (0:00:00.307) 0:10:37.703 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 21 March 2026 19:07:27 -0400 (0:00:00.198) 0:10:37.902 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 21 March 2026 19:07:27 -0400 (0:00:00.313) 0:10:38.215 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 21 March 2026 19:07:27 -0400 (0:00:00.328) 0:10:38.544 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 21 March 2026 19:07:27 -0400 
(0:00:00.228) 0:10:38.772 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.97 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 21 March 2026 19:07:29 -0400 (0:00:01.682) 0:10:40.454 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 21 March 2026 19:07:29 -0400 (0:00:00.237) 0:10:40.692 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 21 March 2026 19:07:30 -0400 (0:00:00.504) 0:10:41.197 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 21 March 2026 19:07:30 -0400 (0:00:00.218) 0:10:41.415 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 21 March 2026 19:07:30 -0400 (0:00:00.242) 0:10:41.658 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 21 March 2026 19:07:31 -0400 (0:00:00.258) 0:10:41.916 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 21 March 2026 19:07:31 -0400 (0:00:00.269) 0:10:42.186 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 21 March 2026 19:07:31 -0400 (0:00:00.293) 0:10:42.480 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 21 March 2026 19:07:31 -0400 (0:00:00.255) 0:10:42.735 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 21 March 2026 19:07:32 -0400 (0:00:00.260) 0:10:42.996 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 21 March 2026 19:07:32 -0400 (0:00:00.229) 0:10:43.225 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 21 March 2026 19:07:32 -0400 (0:00:00.236) 0:10:43.461 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 21 March 2026 19:07:32 -0400 (0:00:00.185) 0:10:43.647 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 21 March 2026 19:07:32 -0400 (0:00:00.179) 0:10:43.827 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 21 March 2026 19:07:33 -0400 (0:00:00.439) 0:10:44.266 ******** skipping: [managed-node9] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": 
"/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 21 March 2026 19:07:33 -0400 (0:00:00.330) 0:10:44.597 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 21 March 2026 19:07:34 -0400 (0:00:00.322) 0:10:44.919 ******** skipping: [managed-node9] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 21 March 2026 19:07:34 -0400 (0:00:00.249) 0:10:45.169 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 21 March 2026 19:07:34 -0400 (0:00:00.409) 0:10:45.578 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 21 March 2026 19:07:34 -0400 (0:00:00.255) 0:10:45.834 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 21 March 2026 19:07:35 -0400 (0:00:00.192) 0:10:46.026 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 21 March 2026 19:07:35 -0400 (0:00:00.186) 0:10:46.213 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 21 March 2026 19:07:35 -0400 (0:00:00.144) 0:10:46.358 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 21 March 2026 19:07:36 -0400 (0:00:00.563) 0:10:46.921 ******** skipping: [managed-node9] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 21 March 2026 19:07:36 -0400 (0:00:00.232) 0:10:47.153 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 21 March 2026 19:07:36 -0400 (0:00:00.463) 0:10:47.617 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 21 March 2026 19:07:36 -0400 (0:00:00.196) 0:10:47.813 ******** skipping: [managed-node9] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 21 March 2026 19:07:37 
-0400 (0:00:00.190) 0:10:48.004 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 21 March 2026 19:07:37 -0400 (0:00:00.260) 0:10:48.264 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 21 March 2026 19:07:37 -0400 (0:00:00.329) 0:10:48.594 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 21 March 2026 19:07:38 -0400 (0:00:00.278) 0:10:48.872 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 21 March 2026 19:07:38 -0400 (0:00:00.319) 0:10:49.192 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 21 March 2026 19:07:38 -0400 (0:00:00.209) 0:10:49.401 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 21 March 2026 19:07:38 -0400 (0:00:00.270) 0:10:49.672 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:07:39 -0400 (0:00:00.544) 0:10:50.216 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:07:39 -0400 (0:00:00.326) 0:10:50.543 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for 
managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:07:41 -0400 (0:00:01.403) 0:10:51.946 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:07:41 -0400 (0:00:00.265) 0:10:52.211 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:07:41 -0400 (0:00:00.204) 0:10:52.416 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:07:41 -0400 (0:00:00.364) 0:10:52.781 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:07:42 -0400 (0:00:00.318) 0:10:53.100 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:07:42 -0400 (0:00:00.311) 0:10:53.411 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:07:42 -0400 (0:00:00.303) 0:10:53.714 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:07:43 -0400 (0:00:00.281) 0:10:53.996 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:07:43 -0400 (0:00:00.325) 0:10:54.322 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:07:44 -0400 (0:00:00.911) 0:10:55.233 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:07:44 -0400 (0:00:00.245) 0:10:55.479 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:07:44 -0400 (0:00:00.241) 0:10:55.720 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:07:45 -0400 (0:00:00.446) 0:10:56.167 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:07:45 -0400 (0:00:00.314) 0:10:56.482 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:07:45 -0400 (0:00:00.264) 0:10:56.747 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:07:46 -0400 (0:00:00.324) 0:10:57.071 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:07:46 -0400 (0:00:00.270) 0:10:57.341 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:07:46 -0400 (0:00:00.200) 0:10:57.542 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:07:47 -0400 (0:00:00.332) 0:10:57.875 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:07:47 -0400 (0:00:00.230) 0:10:58.105 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134417.564398, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134417.564398, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 234099, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1774134417.564398, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:07:48 -0400 (0:00:01.691) 0:10:59.797 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:07:49 -0400 (0:00:00.369) 0:11:00.166 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:07:49 -0400 (0:00:00.451) 0:11:00.618 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:07:50 -0400 (0:00:00.231) 0:11:00.849 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:07:50 -0400 (0:00:00.250) 0:11:01.100 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:07:50 -0400 (0:00:00.236) 0:11:01.336 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:07:50 -0400 (0:00:00.233) 0:11:01.569 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:07:50 -0400 (0:00:00.200) 0:11:01.770 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:07:55 -0400 (0:00:04.307) 0:11:06.077 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:07:55 -0400 (0:00:00.247) 0:11:06.324 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:07:55 -0400 (0:00:00.267) 0:11:06.592 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:07:56 -0400 (0:00:00.408) 0:11:07.000 ******** skipping: [managed-node9] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:07:56 -0400 (0:00:00.265) 0:11:07.266 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:07:56 -0400 (0:00:00.213) 0:11:07.480 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:07:56 -0400 (0:00:00.235) 0:11:07.716 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:07:57 -0400 (0:00:00.221) 0:11:07.938 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:07:57 -0400 (0:00:00.228) 0:11:08.166 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:07:57 -0400 (0:00:00.206) 0:11:08.373 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:07:57 -0400 (0:00:00.254) 0:11:08.627 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:07:58 -0400 (0:00:00.330) 0:11:08.957 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:07:58 -0400 (0:00:00.296) 0:11:09.254 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:07:58 -0400 (0:00:00.315) 0:11:09.569 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:07:58 -0400 (0:00:00.227) 0:11:09.797 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:07:59 -0400 (0:00:00.319) 0:11:10.116 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:07:59 -0400 (0:00:00.133) 0:11:10.250 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:07:59 -0400 (0:00:00.145) 0:11:10.395 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:07:59 -0400 (0:00:00.092) 0:11:10.488 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:07:59 -0400 (0:00:00.113) 0:11:10.602 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.314) 0:11:10.916 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.075) 0:11:10.991 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] 
********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.124) 0:11:11.116 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.168) 0:11:11.285 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.100) 0:11:11.386 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.215) 0:11:11.601 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.080) 0:11:11.681 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:08:00 -0400 (0:00:00.155) 0:11:11.837 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:08:01 -0400 (0:00:00.124) 0:11:11.962 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:08:01 -0400 (0:00:00.202) 0:11:12.164 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:08:01 -0400 (0:00:00.199) 0:11:12.364 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:08:01 -0400 (0:00:00.292) 0:11:12.656 ******** skipping: [managed-node9] => {} TASK [Calculate the 
expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:08:02 -0400 (0:00:00.299) 0:11:12.956 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:08:02 -0400 (0:00:00.262) 0:11:13.218 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:08:02 -0400 (0:00:00.303) 0:11:13.522 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:08:02 -0400 (0:00:00.257) 0:11:13.780 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:08:03 -0400 (0:00:00.216) 0:11:13.996 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:08:03 -0400 (0:00:00.278) 0:11:14.275 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:08:03 -0400 (0:00:00.338) 0:11:14.613 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:08:03 -0400 (0:00:00.149) 0:11:14.763 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:08:04 -0400 (0:00:00.204) 0:11:14.968 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:08:04 -0400 (0:00:00.235) 0:11:15.203 
******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:08:04 -0400 (0:00:00.299) 0:11:15.503 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:08:04 -0400 (0:00:00.291) 0:11:15.795 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:08:05 -0400 (0:00:00.334) 0:11:16.129 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:08:05 -0400 (0:00:00.264) 0:11:16.394 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:08:05 -0400 (0:00:00.364) 0:11:16.758 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:08:06 -0400 (0:00:00.286) 0:11:17.045 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:08:06 -0400 (0:00:00.397) 0:11:17.443 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:08:06 -0400 (0:00:00.266) 0:11:17.709 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:08:07 -0400 (0:00:00.273) 0:11:17.982 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:08:07 -0400 (0:00:00.306) 0:11:18.289 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:08:07 -0400 (0:00:00.265) 0:11:18.554 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:08:07 -0400 (0:00:00.195) 0:11:18.750 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:08:08 -0400 (0:00:00.214) 0:11:18.964 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:08:08 -0400 (0:00:00.203) 0:11:19.167 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:08:08 -0400 (0:00:00.140) 0:11:19.308 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:08:08 -0400 (0:00:00.185) 0:11:19.493 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:08:08 -0400 (0:00:00.198) 0:11:19.691 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:08:08 -0400 (0:00:00.119) 0:11:19.811 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:08:09 -0400 (0:00:00.258) 0:11:20.069 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": 
null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 21 March 2026 19:08:09 -0400 (0:00:00.133) 0:11:20.203 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Saturday 21 March 2026 19:08:10 -0400 (0:00:01.432) 0:11:21.636 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 19:08:11 -0400 (0:00:00.692) 0:11:22.329 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 19:08:11 -0400 (0:00:00.315) 0:11:22.645 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:08:12 -0400 (0:00:00.430) 0:11:23.076 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:08:12 -0400 (0:00:00.246) 0:11:23.322 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:08:12 -0400 (0:00:00.347) 0:11:23.669 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:08:13 -0400 (0:00:00.248) 0:11:23.918 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:08:15 -0400 (0:00:02.029) 0:11:25.947 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": 
"RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:08:15 -0400 (0:00:00.417) 0:11:26.365 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:08:15 -0400 (0:00:00.282) 0:11:26.647 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:08:16 -0400 (0:00:00.323) 0:11:26.970 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:08:16 -0400 (0:00:00.183) 0:11:27.153 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:08:16 -0400 (0:00:00.137) 0:11:27.291 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:08:16 -0400 (0:00:00.496) 0:11:27.787 ******** skipping: [managed-node9] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:08:17 -0400 (0:00:00.189) 0:11:27.976 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:08:17 -0400 (0:00:00.219) 0:11:28.195 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:08:21 -0400 (0:00:03.959) 0:11:32.155 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:08:21 -0400 (0:00:00.223) 0:11:32.378 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:08:21 -0400 (0:00:00.256) 0:11:32.635 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:08:27 -0400 (0:00:05.385) 0:11:38.020 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:08:27 -0400 (0:00:00.267) 0:11:38.287 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:08:27 -0400 (0:00:00.136) 0:11:38.423 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 
21 March 2026 19:08:27 -0400 (0:00:00.198) 0:11:38.622 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:08:28 -0400 (0:00:00.226) 0:11:38.848 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:08:32 -0400 (0:00:04.692) 0:11:43.541 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { 
"name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": 
"active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service": { "name": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service": { "name": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", 
"status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:08:35 -0400 (0:00:03.033) 0:11:46.574 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2dc9479194\x2dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice dev-sda1.device cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap 
cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-c9479194-c226-4bd0-b788-f506f21ba738", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-c9479194-c226-4bd0-b788-f506f21ba738 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-c9479194-c226-4bd0-b788-f506f21ba738 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", 
"LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:06:18 EDT", "StateChangeTimestampMonotonic": "2141507930", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", 
"MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:08:39 -0400 (0:00:03.590) 0:11:50.164 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:08:44 -0400 (0:00:05.388) 0:11:55.552 ******** fatal: [managed-node9]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:08:44 -0400 (0:00:00.223) 0:11:55.776 ******** changed: [managed-node9] => 
(item=systemd-cryptsetup@luks\x2dc9479194\x2dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2dc9479194\\x2dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...dc226\x2d4bd0\x2db788\x2df506f21ba738.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "name": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...dc226\\x2d4bd0\\x2db788\\x2df506f21ba738.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:08:48 -0400 (0:00:03.521) 0:11:59.298 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:08:48 -0400 (0:00:00.204) 0:11:59.502 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Saturday 21 March 2026 19:08:48 -0400 (0:00:00.233) 0:11:59.735 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 21 March 2026 19:08:49 -0400 (0:00:00.125) 0:11:59.861 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134490.556819, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774134490.556819, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1774134490.556819, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "4195486640", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 21 March 2026 19:08:49 -0400 (0:00:00.939) 0:12:00.800 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:306 Saturday 21 March 2026 19:08:50 -0400 (0:00:00.134) 0:12:00.935 ******** ok: [managed-node9] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testno1nigzclukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:313 Saturday 21 March 2026 19:08:52 -0400 (0:00:02.896) 0:12:03.832 ******** ok: [managed-node9] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testno1nigzclukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1774134533.2460241-199053-154232626957988/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:320 Saturday 21 March 2026 19:08:57 -0400 (0:00:04.181) 0:12:08.013 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:08:57 -0400 (0:00:00.189) 
0:12:08.203 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:08:57 -0400 (0:00:00.189) 0:12:08.392 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:08:57 -0400 (0:00:00.299) 0:12:08.691 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:08:58 -0400 (0:00:00.175) 0:12:08.867 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:09:00 -0400 (0:00:02.062) 0:12:10.929 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:09:00 -0400 (0:00:00.385) 0:12:11.315 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:09:00 -0400 (0:00:00.265) 0:12:11.581 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:09:01 -0400 (0:00:00.293) 0:12:11.875 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:09:01 -0400 (0:00:00.179) 0:12:12.054 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:09:01 -0400 (0:00:00.260) 0:12:12.314 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:09:02 -0400 (0:00:00.592) 0:12:12.907 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:09:02 -0400 (0:00:00.224) 0:12:13.132 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:09:02 -0400 (0:00:00.195) 0:12:13.328 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:09:07 -0400 (0:00:04.629) 0:12:17.957 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testno1nigzclukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:09:07 -0400 (0:00:00.202) 0:12:18.160 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:09:07 
-0400 (0:00:00.253) 0:12:18.414 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:09:13 -0400 (0:00:05.542) 0:12:23.957 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:09:13 -0400 (0:00:00.400) 0:12:24.358 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:09:13 -0400 (0:00:00.144) 0:12:24.502 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:09:13 -0400 (0:00:00.245) 0:12:24.748 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:09:14 -0400 (0:00:00.169) 0:12:24.918 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:09:18 -0400 (0:00:04.256) 0:12:29.174 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": 
"systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:09:21 -0400 (0:00:02.856) 0:12:32.030 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:09:21 -0400 (0:00:00.339) 0:12:32.370 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", 
"fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-03531268-47a9-4948-bc77-34ba208d0f8e", "password": "/tmp/storage_testno1nigzclukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testno1nigzclukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:09:35 -0400 (0:00:13.616) 0:12:45.986 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:09:35 -0400 (0:00:00.146) 0:12:46.132 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134430.1044703, "attr_flags": "", "attributes": [], "block_size": 4096, 
"blocks": 8, "charset": "us-ascii", "checksum": "53b89c8e3fb801076f9956f80e854770ff721942", "ctime": 1774134430.1014702, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134430.1014702, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1413, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:09:36 -0400 (0:00:01.573) 0:12:47.706 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:09:38 -0400 (0:00:01.736) 0:12:49.442 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:09:38 -0400 (0:00:00.300) 0:12:49.743 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-03531268-47a9-4948-bc77-34ba208d0f8e", "password": "/tmp/storage_testno1nigzclukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_raw_device": "/dev/sda1", 
"_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testno1nigzclukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:09:39 -0400 (0:00:00.275) 0:12:50.018 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testno1nigzclukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:09:39 -0400 (0:00:00.276) 0:12:50.295 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete 
mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:09:39 -0400 (0:00:00.241) 0:12:50.537 ******** changed: [managed-node9] => (item={'src': 'UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=59c103d4-0981-45a8-bb35-e49fb4f41de3" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:09:41 -0400 (0:00:01.464) 0:12:52.002 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:09:42 -0400 (0:00:01.731) 0:12:53.734 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:09:44 -0400 (0:00:01.625) 0:12:55.359 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:09:44 -0400 (0:00:00.253) 0:12:55.613 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:09:46 -0400 (0:00:02.144) 0:12:57.757 ******** ok: [managed-node9] 
=> { "changed": false, "stat": { "atime": 1774134444.1165512, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774134435.6115022, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 287310338, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1774134435.610502, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1516072617", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:09:48 -0400 (0:00:01.414) 0:12:59.172 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-03531268-47a9-4948-bc77-34ba208d0f8e', 'password': '/tmp/storage_testno1nigzclukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-03531268-47a9-4948-bc77-34ba208d0f8e", "password": "/tmp/storage_testno1nigzclukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:09:50 -0400 (0:00:01.728) 0:13:00.901 ******** ok: [managed-node9] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:336 Saturday 21 March 2026 19:09:52 -0400 (0:00:01.977) 0:13:02.878 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:09:52 -0400 (0:00:00.356) 0:13:03.235 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": 
"/tmp/storage_testno1nigzclukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:09:52 -0400 (0:00:00.198) 0:13:03.433 ******** skipping: [managed-node9] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:09:52 -0400 (0:00:00.164) 0:13:03.597 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "size": "4G", "type": "crypt", "uuid": "36d7680a-a9f5-4943-a036-0eb27e478f7e" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "03531268-47a9-4948-bc77-34ba208d0f8e" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:09:54 -0400 (0:00:01.659) 0:13:05.257 ******** ok: [managed-node9] => { 
"changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002638", "end": "2026-03-21 19:09:55.784513", "rc": 0, "start": "2026-03-21 19:09:55.781875" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:09:55 -0400 (0:00:01.562) 0:13:06.819 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002668", "end": "2026-03-21 19:09:56.981567", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:09:56.978899" } STDOUT: luks-03531268-47a9-4948-bc77-34ba208d0f8e /dev/sda1 /tmp/storage_testno1nigzclukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:09:57 -0400 (0:00:01.269) 0:13:08.089 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 21 March 2026 19:09:57 -0400 (0:00:00.275) 0:13:08.364 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 21 March 2026 19:09:57 -0400 (0:00:00.087) 0:13:08.451 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 21 March 2026 19:09:57 -0400 (0:00:00.136) 
0:13:08.587 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 21 March 2026 19:09:57 -0400 (0:00:00.134) 0:13:08.722 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 21 March 2026 19:09:58 -0400 (0:00:00.414) 0:13:09.136 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 21 March 2026 19:09:58 -0400 (0:00:00.170) 0:13:09.307 ******** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 21 March 2026 19:09:58 -0400 (0:00:00.253) 0:13:09.561 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 21 March 2026 19:09:58 -0400 (0:00:00.226) 0:13:09.787 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 21 March 2026 19:09:59 -0400 (0:00:00.263) 0:13:10.051 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 21 March 2026 19:09:59 -0400 (0:00:00.230) 0:13:10.281 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 21 March 2026 19:09:59 -0400 (0:00:00.249) 0:13:10.530 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 21 March 2026 19:10:00 -0400 (0:00:00.377) 0:13:10.907 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 21 March 2026 19:10:00 -0400 (0:00:00.384) 0:13:11.292 ******** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 21 March 2026 19:10:00 -0400 (0:00:00.214) 0:13:11.506 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.97 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 21 March 2026 19:10:02 -0400 (0:00:01.564) 0:13:13.070 ******** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 21 March 2026 19:10:02 -0400 (0:00:00.262) 0:13:13.333 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 21 March 2026 19:10:02 -0400 (0:00:00.369) 0:13:13.702 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 21 March 2026 19:10:03 -0400 (0:00:00.267) 0:13:13.970 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 21 March 2026 19:10:03 -0400 (0:00:00.260) 0:13:14.231 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 21 March 2026 19:10:03 -0400 (0:00:00.254) 0:13:14.486 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 21 March 2026 19:10:03 -0400 (0:00:00.180) 0:13:14.666 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 21 March 2026 19:10:04 -0400 (0:00:00.259) 0:13:14.926 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 21 March 2026 19:10:04 -0400 (0:00:00.241) 0:13:15.168 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 21 March 2026 19:10:04 -0400 (0:00:00.272) 0:13:15.440 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 21 March 2026 19:10:04 -0400 (0:00:00.191) 0:13:15.632 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 21 March 2026 19:10:05 -0400 (0:00:00.228) 0:13:15.860 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 21 March 2026 19:10:05 -0400 (0:00:00.237) 0:13:16.098 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 21 March 2026 19:10:05 -0400 (0:00:00.216) 0:13:16.314 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 21 March 2026 19:10:05 -0400 (0:00:00.428) 0:13:16.743 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testno1nigzclukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 
'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testno1nigzclukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 21 March 2026 19:10:06 -0400 (0:00:00.147) 0:13:16.891 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 21 March 2026 19:10:06 -0400 (0:00:00.438) 0:13:17.329 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testno1nigzclukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', '_raw_device': '/dev/sda1', '_mount_id': 
'/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testno1nigzclukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 21 March 2026 19:10:06 -0400 (0:00:00.309) 0:13:17.638 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 21 March 2026 19:10:07 -0400 (0:00:00.364) 0:13:18.002 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 21 March 2026 19:10:07 -0400 (0:00:00.268) 0:13:18.271 ******** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 21 March 2026 19:10:07 -0400 (0:00:00.177) 0:13:18.449 ******** TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 21 March 2026 19:10:07 -0400 (0:00:00.153) 0:13:18.602 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 21 
March 2026 19:10:08 -0400 (0:00:00.712) 0:13:19.314 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 21 March 2026 19:10:08 -0400 (0:00:00.433) 0:13:19.747 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testno1nigzclukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testno1nigzclukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 21 March 2026 19:10:09 -0400 (0:00:00.290) 0:13:20.038 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 21 March 2026 19:10:09 -0400 (0:00:00.576) 0:13:20.614 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 21 March 2026 19:10:09 -0400 (0:00:00.219) 0:13:20.833 ******** skipping: [managed-node9] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 21 March 2026 19:10:10 -0400 (0:00:00.238) 0:13:21.072 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 21 March 2026 19:10:10 -0400 (0:00:00.172) 0:13:21.244 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 21 March 2026 19:10:10 -0400 (0:00:00.173) 0:13:21.417 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 21 March 2026 19:10:10 -0400 (0:00:00.208) 0:13:21.626 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 21 March 2026 19:10:10 -0400 (0:00:00.174) 0:13:21.801 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 21 March 2026 19:10:11 -0400 (0:00:00.127) 0:13:21.929 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 21 March 2026 19:10:11 -0400 (0:00:00.144) 0:13:22.073 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:10:11 -0400 (0:00:00.372) 0:13:22.445 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:10:11 -0400 (0:00:00.272) 0:13:22.718 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:10:12 -0400 (0:00:01.041) 0:13:23.760 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:10:13 -0400 (0:00:00.176) 0:13:23.936 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:10:13 -0400 (0:00:00.218) 0:13:24.154 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:10:13 -0400 (0:00:00.261) 0:13:24.416 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:10:13 -0400 (0:00:00.256) 0:13:24.672 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:10:14 -0400 (0:00:00.238) 0:13:24.911 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:10:14 -0400 (0:00:00.295) 0:13:25.207 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:10:14 -0400 (0:00:00.209) 0:13:25.416 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:10:14 -0400 (0:00:00.204) 0:13:25.620 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:10:15 -0400 (0:00:00.240) 0:13:25.861 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:10:15 -0400 (0:00:00.250) 0:13:26.112 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:10:15 -0400 (0:00:00.155) 0:13:26.268 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:10:15 -0400 (0:00:00.355) 0:13:26.624 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:10:15 -0400 (0:00:00.199) 0:13:26.823 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:10:16 -0400 (0:00:00.174) 0:13:26.998 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:10:16 -0400 (0:00:00.112) 0:13:27.110 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:10:16 -0400 (0:00:00.180) 0:13:27.291 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:10:16 -0400 (0:00:00.158) 0:13:27.449 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:10:16 -0400 (0:00:00.200) 0:13:27.650 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:10:17 -0400 (0:00:00.325) 0:13:27.975 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134574.641304, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134574.641304, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 234099, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1774134574.641304, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, 
"rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:10:18 -0400 (0:00:01.388) 0:13:29.364 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:10:18 -0400 (0:00:00.231) 0:13:29.596 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:10:18 -0400 (0:00:00.236) 0:13:29.832 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:10:19 -0400 (0:00:00.146) 0:13:29.979 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:10:19 -0400 (0:00:00.134) 0:13:30.113 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:10:19 -0400 (0:00:00.332) 0:13:30.446 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:10:19 -0400 (0:00:00.206) 0:13:30.652 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134574.800305, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134574.800305, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 250474, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134574.800305, "nlink": 1, "path": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] 
******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:10:21 -0400 (0:00:01.251) 0:13:31.904 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:10:25 -0400 (0:00:03.957) 0:13:35.862 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.010643", "end": "2026-03-21 19:10:26.350996", "rc": 0, "start": "2026-03-21 19:10:26.340353" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 03531268-47a9-4948-bc77-34ba208d0f8e Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 923141 Threads: 2 Salt: 20 ee 22 cc 0b 42 2a 53 4d ff 24 58 ce 6d 14 1e 6a 79 53 65 f5 fe 18 3b 7d 6e 70 7f 42 06 f3 f2 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120029 Salt: f4 da b6 39 a9 3b 72 7c 70 0e 44 15 e4 0e 10 b2 84 96 df cc b7 c4 22 c7 b0 36 03 0d 39 b3 47 31 Digest: 04 77 88 33 c0 12 2a 78 e9 04 d3 ca be 49 ec 0b db 16 87 f4 b2 99 77 ce 3f a3 8e 4b a8 89 42 42 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:10:26 -0400 (0:00:01.563) 0:13:37.425 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:10:26 -0400 (0:00:00.328) 0:13:37.754 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:10:27 -0400 (0:00:00.266) 0:13:38.021 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:10:27 -0400 (0:00:00.232) 0:13:38.253 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:10:27 -0400 (0:00:00.212) 0:13:38.465 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:10:27 -0400 (0:00:00.177) 0:13:38.643 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:10:28 -0400 (0:00:00.240) 0:13:38.883 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:10:28 -0400 (0:00:00.179) 0:13:39.063 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-03531268-47a9-4948-bc77-34ba208d0f8e /dev/sda1 /tmp/storage_testno1nigzclukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testno1nigzclukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:10:28 -0400 (0:00:00.345) 0:13:39.408 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:10:28 -0400 (0:00:00.288) 0:13:39.696 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:10:29 -0400 (0:00:00.280) 0:13:39.976 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:10:29 -0400 (0:00:00.274) 0:13:40.250 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:10:29 -0400 (0:00:00.227) 0:13:40.478 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:10:29 
-0400 (0:00:00.277) 0:13:40.755 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:10:30 -0400 (0:00:00.385) 0:13:41.141 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:10:30 -0400 (0:00:00.292) 0:13:41.433 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:10:30 -0400 (0:00:00.173) 0:13:41.607 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:10:30 -0400 (0:00:00.214) 0:13:41.821 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:10:31 -0400 (0:00:00.219) 0:13:42.040 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:10:31 -0400 (0:00:00.281) 0:13:42.322 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:10:31 -0400 (0:00:00.201) 0:13:42.523 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:10:31 -0400 (0:00:00.209) 0:13:42.733 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:10:32 -0400 (0:00:00.235) 0:13:42.969 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] 
************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:10:32 -0400 (0:00:00.228) 0:13:43.198 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:10:32 -0400 (0:00:00.222) 0:13:43.420 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:10:32 -0400 (0:00:00.272) 0:13:43.693 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:10:33 -0400 (0:00:00.162) 0:13:43.856 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:10:33 -0400 (0:00:00.204) 0:13:44.061 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:10:33 -0400 (0:00:00.336) 0:13:44.397 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:10:33 -0400 (0:00:00.171) 0:13:44.569 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:10:34 -0400 (0:00:00.548) 0:13:45.118 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:10:34 -0400 (0:00:00.260) 0:13:45.378 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:10:34 -0400 (0:00:00.275) 0:13:45.653 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal 
thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:10:34 -0400 (0:00:00.158) 0:13:45.812 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:10:35 -0400 (0:00:00.160) 0:13:45.972 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:10:35 -0400 (0:00:00.112) 0:13:46.084 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:10:35 -0400 (0:00:00.151) 0:13:46.236 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:10:35 -0400 (0:00:00.198) 0:13:46.435 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:10:35 -0400 (0:00:00.097) 0:13:46.532 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:10:35 -0400 (0:00:00.145) 0:13:46.677 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:10:35 -0400 (0:00:00.142) 0:13:46.819 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:10:36 -0400 (0:00:00.231) 0:13:47.051 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:10:36 -0400 (0:00:00.242) 0:13:47.294 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the 
expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:10:36 -0400 (0:00:00.200) 0:13:47.495 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:10:36 -0400 (0:00:00.180) 0:13:47.675 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:10:37 -0400 (0:00:00.173) 0:13:47.848 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:10:37 -0400 (0:00:00.250) 0:13:48.098 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:10:37 -0400 (0:00:00.232) 0:13:48.331 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:10:37 -0400 (0:00:00.151) 0:13:48.482 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:10:37 -0400 (0:00:00.247) 0:13:48.729 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:10:38 -0400 (0:00:00.156) 0:13:48.886 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:10:38 -0400 (0:00:00.170) 0:13:49.056 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:10:38 -0400 (0:00:00.228) 0:13:49.284 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:10:38 -0400 (0:00:00.105) 0:13:49.390 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:10:38 -0400 (0:00:00.315) 0:13:49.706 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:10:39 -0400 (0:00:00.393) 0:13:50.099 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:10:39 -0400 (0:00:00.251) 0:13:50.351 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:10:39 -0400 (0:00:00.230) 0:13:50.581 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:10:39 -0400 (0:00:00.198) 0:13:50.780 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:10:40 -0400 (0:00:00.170) 0:13:50.951 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:342 Saturday 21 March 2026 19:10:40 -0400 (0:00:00.139) 0:13:51.090 ******** ok: [managed-node9] => { "changed": false, "path": "/tmp/storage_testno1nigzclukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:352 Saturday 21 March 2026 19:10:41 -0400 (0:00:01.304) 0:13:52.394 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 19:10:41 -0400 (0:00:00.235) 0:13:52.629 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 19:10:41 -0400 (0:00:00.195) 0:13:52.825 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:10:42 -0400 (0:00:00.227) 0:13:53.053 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:10:42 -0400 (0:00:00.136) 0:13:53.189 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:10:42 -0400 (0:00:00.195) 0:13:53.385 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:10:42 -0400 (0:00:00.127) 0:13:53.512 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:10:44 -0400 (0:00:02.109) 0:13:55.622 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => 
(item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:10:45 -0400 (0:00:00.462) 0:13:56.084 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:10:45 -0400 (0:00:00.274) 0:13:56.359 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:10:45 -0400 (0:00:00.209) 0:13:56.569 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:10:45 -0400 (0:00:00.151) 0:13:56.721 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:10:46 -0400 (0:00:00.169) 0:13:56.890 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:10:46 -0400 (0:00:00.529) 0:13:57.419 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:10:46 -0400 (0:00:00.348) 0:13:57.768 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:10:47 -0400 (0:00:00.790) 0:13:58.558 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo 
libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:10:51 -0400 (0:00:04.230) 0:14:02.789 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:10:52 -0400 (0:00:00.366) 0:14:03.155 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:10:52 -0400 (0:00:00.255) 0:14:03.410 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:10:57 -0400 (0:00:05.052) 0:14:08.463 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:10:58 -0400 (0:00:00.491) 0:14:08.954 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:10:58 -0400 (0:00:00.235) 0:14:09.189 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:10:58 -0400 (0:00:00.208) 0:14:09.397 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:10:58 -0400 (0:00:00.125) 0:14:09.523 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:11:02 -0400 (0:00:03.905) 0:14:13.429 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", 
"source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { 
"name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": 
"systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { 
"name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:11:05 -0400 (0:00:02.578) 0:14:16.008 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:11:05 -0400 (0:00:00.242) 0:14:16.251 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:11:10 -0400 (0:00:04.768) 0:14:21.019 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'msg': "encrypted volume 'test1' missing key/password", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 
'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:11:10 -0400 (0:00:00.235) 0:14:21.254 ******** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:11:10 -0400 (0:00:00.267) 0:14:21.521 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:11:10 -0400 (0:00:00.185) 0:14:21.707 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Saturday 21 March 2026 19:11:10 -0400 (0:00:00.117) 0:14:21.825 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:370 Saturday 21 March 2026 19:11:11 -0400 (0:00:00.063) 0:14:21.888 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:11:11 -0400 (0:00:00.146) 0:14:22.035 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:11:11 -0400 (0:00:00.089) 0:14:22.125 ******** 
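[Editor's note: the following sketch is not part of the captured log. It shows, as a minimal playbook, how the storage role could be driven with the same storage_pools input that the "Show storage_pools" task echoes just below; the play name and the "become" setting are assumptions, while the host, role name, and every pool/volume value are copied verbatim from that output.]

    ---
    - name: Create an encrypted LVM volume with the default filesystem (sketch)
      hosts: managed-node9
      become: true          # assumption; privilege escalation is typical for storage changes
      tasks:
        - name: Run the storage role with one LUKS1-encrypted LVM volume
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: lvm
                disks:
                  - sda
                volumes:
                  - name: test1
                    size: 4g
                    mount_point: /opt/test1
                    encryption: true
                    encryption_luks_version: luks1
                    encryption_cipher: aes-xts-plain64
                    encryption_key_size: 512
                    encryption_password: yabbadabbadoo   # test-only password taken from the log

[As the earlier "Test for correct handling of new encrypted volume w/ no key" sequence shows, leaving out encryption_password (and encryption_key) while encryption is true makes the role fail with "encrypted volume 'test1' missing key/password", which that test asserts as the expected error.]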
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:11:11 -0400 (0:00:00.045) 0:14:22.171 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:11:11 -0400 (0:00:00.048) 0:14:22.219 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:11:13 -0400 (0:00:01.774) 0:14:23.993 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:11:13 -0400 (0:00:00.266) 0:14:24.260 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:11:13 -0400 (0:00:00.202) 0:14:24.463 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:11:13 -0400 (0:00:00.257) 0:14:24.720 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:11:14 -0400 (0:00:00.124) 0:14:24.845 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:11:14 -0400 (0:00:00.138) 0:14:24.984 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:11:14 -0400 (0:00:00.462) 0:14:25.446 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:11:14 -0400 (0:00:00.121) 0:14:25.568 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:11:14 -0400 (0:00:00.228) 0:14:25.797 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:11:19 -0400 (0:00:04.310) 0:14:30.108 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:11:19 -0400 (0:00:00.387) 0:14:30.495 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:11:19 -0400 (0:00:00.313) 0:14:30.808 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:11:25 -0400 (0:00:05.879) 0:14:36.687 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:11:26 -0400 (0:00:00.283) 0:14:36.971 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:11:26 -0400 (0:00:00.387) 0:14:37.358 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:11:26 -0400 (0:00:00.222) 0:14:37.581 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:11:26 -0400 (0:00:00.138) 0:14:37.720 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:11:31 -0400 (0:00:04.190) 0:14:41.910 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": 
"dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { 
"name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:11:34 -0400 (0:00:02.935) 0:14:44.845 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:11:34 -0400 (0:00:00.189) 0:14:45.035 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy 
format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-03531268-47a9-4948-bc77-34ba208d0f8e", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:11:45 -0400 (0:00:11.488) 0:14:56.524 ******** skipping: [managed-node9] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:11:45 -0400 (0:00:00.233) 0:14:56.757 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134584.195359, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f133d4f84703ced9806a3bb3478715b0a182fbc6", "ctime": 1774134584.192359, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134584.192359, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:11:47 -0400 (0:00:01.337) 0:14:58.095 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:11:48 -0400 (0:00:01.411) 0:14:59.506 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:11:48 -0400 (0:00:00.245) 0:14:59.752 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-03531268-47a9-4948-bc77-34ba208d0f8e", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", 
"/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:11:49 -0400 (0:00:00.194) 0:14:59.947 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", 
"encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:11:49 -0400 (0:00:00.277) 0:15:00.224 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:11:49 -0400 (0:00:00.196) 0:15:00.420 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-03531268-47a9-4948-bc77-34ba208d0f8e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:11:50 -0400 (0:00:01.206) 0:15:01.627 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:11:52 -0400 (0:00:01.887) 0:15:03.514 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:11:54 
-0400 (0:00:01.695) 0:15:05.210 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:11:54 -0400 (0:00:00.321) 0:15:05.531 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:11:56 -0400 (0:00:02.051) 0:15:07.582 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134596.9804327, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3832db8958436429dc095df2de379a81e6c4f903", "ctime": 1774134589.6073902, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 436207750, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1774134589.6063902, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "3254161465", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:11:58 -0400 (0:00:01.599) 0:15:09.182 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-03531268-47a9-4948-bc77-34ba208d0f8e', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-03531268-47a9-4948-bc77-34ba208d0f8e", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node9] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:12:01 -0400 (0:00:02.952) 0:15:12.134 ******** ok: [managed-node9] TASK [Verify role results - 7] 
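The blivet actions and the fstab/crypttab updates recorded above amount to moving the encrypted volume from a raw LUKS partition onto an LVM logical volume in pool "foo". A minimal sketch of the kind of role invocation that would request this target state is shown below; the pool and volume values are taken from the logged pool data, but the exact task layout and any helper variables in tests_luks.yml are assumptions, and the passphrase is a placeholder (the real value is masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER in the output above).

- name: Sketch of an invocation requesting the state shown above (values from the logged pool data; exact layout assumed)
  hosts: managed-node9
  tasks:
    - name: Run the storage role with an encrypted LVM volume
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: ["sda"]
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks1
                encryption_cipher: aes-xts-plain64
                encryption_key_size: 512
                encryption_password: "example-passphrase"  # placeholder; masked as no_log in the run above

With this specification the role tears down the old luks-03531268-* mapping on /dev/sda1, creates the foo/test1 logical volume, layers LUKS1 and xfs on top of it, and rewrites the /opt/test1 entries in /etc/fstab and /etc/crypttab, which is exactly the sequence of destroy/create actions listed in the blivet output.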
************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:388 Saturday 21 March 2026 19:12:03 -0400 (0:00:02.053) 0:15:14.188 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:12:03 -0400 (0:00:00.262) 0:15:14.451 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:12:03 -0400 (0:00:00.268) 0:15:14.719 ******** skipping: [managed-node9] => {} TASK [Collect info about the volumes.] 
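The verification tasks that follow read back /etc/fstab and /etc/crypttab and assert that the new luks-f3a44b09-* mapping is the only device wired into /opt/test1. A simplified, hedged sketch of an equivalent check is given below; it is not the verbatim contents of verify-role-results.yml, and the play/task names are illustrative only.

- name: Sketch of a simplified fstab/crypttab verification (not the verbatim test file)
  hosts: managed-node9
  tasks:
    - name: Read /etc/crypttab
      command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Expect exactly one crypttab entry backed by /dev/mapper/foo-test1
      assert:
        that:
          - crypttab.stdout_lines | select('search', '/dev/mapper/foo-test1') | list | length == 1

    - name: Read /etc/fstab
      command: cat /etc/fstab
      register: fstab
      changed_when: false

    - name: Expect /opt/test1 to be mounted from a luks-* mapper device
      assert:
        that:
          - fstab.stdout is search('/dev/mapper/luks-')
          - fstab.stdout is search('/opt/test1')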
***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:12:04 -0400 (0:00:00.183) 0:15:14.903 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f3a44b09-71d6-43a0-aacd-e66622f7a65f" }, "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "size": "4G", "type": "crypt", "uuid": "3890bc67-451e-4383-bdd1-5d2ad5f20353" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:12:05 -0400 (0:00:01.783) 0:15:16.687 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002998", "end": "2026-03-21 19:12:07.196341", "rc": 0, "start": "2026-03-21 19:12:07.193343" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:12:07 -0400 (0:00:01.575) 0:15:18.262 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002651", "end": "2026-03-21 19:12:08.727065", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:12:08.724414" } STDOUT: luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:12:08 -0400 (0:00:01.539) 0:15:19.802 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 21 March 2026 19:12:09 -0400 (0:00:00.418) 0:15:20.220 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 21 March 2026 19:12:09 -0400 (0:00:00.117) 0:15:20.337 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024900", "end": "2026-03-21 19:12:10.613644", "rc": 0, "start": "2026-03-21 19:12:10.588744" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 21 March 2026 19:12:11 -0400 (0:00:01.921) 0:15:22.259 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 21 March 2026 19:12:11 -0400 (0:00:00.326) 0:15:22.585 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 21 March 2026 19:12:12 -0400 (0:00:00.415) 0:15:23.001 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 21 March 2026 19:12:12 -0400 (0:00:00.292) 0:15:23.293 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 21 March 2026 19:12:14 -0400 (0:00:02.546) 0:15:25.839 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 21 March 2026 19:12:15 -0400 (0:00:00.247) 0:15:26.087 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 21 March 2026 19:12:15 -0400 (0:00:00.393) 0:15:26.480 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 21 March 2026 19:12:16 -0400 (0:00:00.391) 0:15:26.872 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 21 March 2026 19:12:16 -0400 (0:00:00.268) 0:15:27.141 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 21 March 2026 19:12:16 -0400 (0:00:00.327) 0:15:27.468 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 
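Among the pool-member checks above, the VG shared-flag query runs vgs with the binary "shared" column and asserts it reports 0, since the pool was created with shared set to false. A standalone sketch of that same check, outside the test-verify-pool.yml include structure, could look like the following (task names are illustrative; the command matches the one logged above).

- name: Sketch of the VG shared-flag check as a standalone task
  hosts: managed-node9
  tasks:
    - name: Query the shared flag for VG foo
      command: vgs --noheadings --binary -o shared foo
      register: vg_shared
      changed_when: false

    - name: Assert the flag is 0 because the pool was requested with shared=false
      assert:
        that:
          - vg_shared.stdout | trim == "0"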
Saturday 21 March 2026 19:12:16 -0400 (0:00:00.301) 0:15:27.770 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 21 March 2026 19:12:17 -0400 (0:00:00.363) 0:15:28.134 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.97 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 21 March 2026 19:12:18 -0400 (0:00:01.547) 0:15:29.682 ******** skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 21 March 2026 19:12:19 -0400 (0:00:00.306) 0:15:29.988 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 21 March 2026 19:12:19 -0400 (0:00:00.520) 0:15:30.509 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 21 March 2026 19:12:19 -0400 (0:00:00.290) 0:15:30.799 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 21 March 2026 19:12:20 -0400 (0:00:00.276) 0:15:31.076 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 21 March 2026 19:12:20 -0400 (0:00:00.234) 0:15:31.310 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 21 March 2026 19:12:20 -0400 (0:00:00.234) 0:15:31.545 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 21 March 2026 19:12:20 
-0400 (0:00:00.278) 0:15:31.824 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 21 March 2026 19:12:21 -0400 (0:00:00.273) 0:15:32.097 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 21 March 2026 19:12:21 -0400 (0:00:00.187) 0:15:32.285 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 21 March 2026 19:12:21 -0400 (0:00:00.203) 0:15:32.488 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 21 March 2026 19:12:21 -0400 (0:00:00.177) 0:15:32.665 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 21 March 2026 19:12:22 -0400 (0:00:00.193) 0:15:32.859 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 21 March 2026 19:12:22 -0400 (0:00:00.194) 0:15:33.054 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 21 March 2026 19:12:22 -0400 (0:00:00.448) 0:15:33.503 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 21 March 2026 19:12:23 -0400 (0:00:00.406) 0:15:33.909 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 21 March 2026 19:12:23 
-0400 (0:00:00.314) 0:15:34.224 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 21 March 2026 19:12:23 -0400 (0:00:00.223) 0:15:34.447 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 21 March 2026 19:12:23 -0400 (0:00:00.234) 0:15:34.682 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 21 March 2026 19:12:24 -0400 (0:00:00.202) 0:15:34.885 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 21 March 2026 19:12:24 -0400 (0:00:00.270) 0:15:35.155 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 21 March 2026 19:12:24 -0400 (0:00:00.224) 0:15:35.380 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 21 March 2026 19:12:24 -0400 (0:00:00.236) 0:15:35.616 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 21 March 2026 19:12:25 -0400 (0:00:00.422) 0:15:36.039 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 21 March 2026 19:12:25 -0400 (0:00:00.310) 0:15:36.349 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 21 March 2026 19:12:25 -0400 (0:00:00.157) 0:15:36.506 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 21 March 2026 19:12:25 -0400 (0:00:00.285) 0:15:36.792 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 21 March 2026 19:12:26 -0400 (0:00:00.191) 0:15:36.984 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 21 March 2026 19:12:26 -0400 (0:00:00.301) 0:15:37.285 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 21 March 2026 19:12:26 -0400 (0:00:00.527) 0:15:37.813 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 21 March 2026 19:12:27 -0400 (0:00:00.283) 0:15:38.096 ******** skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 21 March 2026 19:12:27 -0400 (0:00:00.251) 0:15:38.347 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 21 March 2026 19:12:27 -0400 (0:00:00.418) 0:15:38.766 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 21 March 2026 19:12:28 -0400 (0:00:00.305) 0:15:39.072 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 21 March 2026 
19:12:28 -0400 (0:00:00.308) 0:15:39.380 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 21 March 2026 19:12:28 -0400 (0:00:00.289) 0:15:39.670 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 21 March 2026 19:12:29 -0400 (0:00:00.314) 0:15:39.984 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 21 March 2026 19:12:29 -0400 (0:00:00.297) 0:15:40.282 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 21 March 2026 19:12:29 -0400 (0:00:00.288) 0:15:40.570 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 21 March 2026 19:12:29 -0400 (0:00:00.250) 0:15:40.821 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 21 March 2026 19:12:30 -0400 (0:00:00.517) 0:15:41.339 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 21 March 2026 19:12:31 -0400 (0:00:00.624) 0:15:41.963 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 21 March 2026 19:12:31 -0400 (0:00:00.273) 0:15:42.236 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 21 March 2026 19:12:31 -0400 (0:00:00.286) 
0:15:42.523 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 21 March 2026 19:12:32 -0400 (0:00:00.381) 0:15:42.904 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 21 March 2026 19:12:32 -0400 (0:00:00.319) 0:15:43.224 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 21 March 2026 19:12:32 -0400 (0:00:00.303) 0:15:43.527 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 21 March 2026 19:12:32 -0400 (0:00:00.210) 0:15:43.738 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 21 March 2026 19:12:33 -0400 (0:00:00.284) 0:15:44.022 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 21 March 2026 19:12:33 -0400 (0:00:00.691) 0:15:44.714 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 21 March 2026 19:12:34 -0400 (0:00:00.280) 0:15:44.994 ******** skipping: [managed-node9] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 21 March 2026 19:12:34 -0400 (0:00:00.288) 0:15:45.283 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 21 March 2026 19:12:34 -0400 (0:00:00.253) 0:15:45.537 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 21 March 2026 19:12:35 -0400 (0:00:00.334) 0:15:45.871 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 21 March 2026 19:12:35 -0400 (0:00:00.227) 0:15:46.099 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 21 March 2026 19:12:35 -0400 (0:00:00.261) 0:15:46.360 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 21 March 2026 19:12:35 -0400 (0:00:00.222) 0:15:46.583 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 21 March 2026 19:12:36 -0400 (0:00:00.271) 0:15:46.854 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:12:36 -0400 (0:00:00.450) 0:15:47.305 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:12:36 -0400 (0:00:00.341) 0:15:47.647 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:12:38 -0400 (0:00:01.255) 0:15:48.903 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:12:38 -0400 (0:00:00.289) 0:15:49.193 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:12:39 -0400 (0:00:01.487) 0:15:50.680 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:12:40 -0400 (0:00:00.277) 0:15:50.957 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:12:40 -0400 (0:00:00.270) 0:15:51.228 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:12:40 -0400 (0:00:00.363) 0:15:51.591 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:12:40 -0400 (0:00:00.239) 0:15:51.831 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:12:41 -0400 (0:00:00.344) 0:15:52.175 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:12:41 -0400 (0:00:00.330) 0:15:52.506 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:12:41 -0400 (0:00:00.241) 0:15:52.747 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:12:42 -0400 (0:00:00.327) 0:15:53.074 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:12:42 -0400 (0:00:00.285) 0:15:53.360 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:12:43 -0400 (0:00:00.522) 0:15:53.882 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:12:43 -0400 (0:00:00.342) 0:15:54.225 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:12:43 -0400 (0:00:00.262) 0:15:54.487 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:12:43 -0400 (0:00:00.324) 0:15:54.812 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:12:44 -0400 (0:00:00.314) 0:15:55.127 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:12:44 -0400 (0:00:00.150) 0:15:55.277 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:12:44 -0400 (0:00:00.307) 0:15:55.585 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:12:45 -0400 (0:00:00.309) 0:15:55.895 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134705.2170587, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134705.2170587, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 265193, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134705.2170587, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:12:46 -0400 (0:00:01.708) 0:15:57.603 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:12:47 -0400 (0:00:00.284) 0:15:57.888 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:12:47 -0400 (0:00:00.098) 0:15:57.986 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:12:47 -0400 (0:00:00.142) 0:15:58.129 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:12:47 -0400 (0:00:00.266) 0:15:58.395 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:12:47 -0400 (0:00:00.289) 0:15:58.685 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:12:48 -0400 (0:00:00.234) 0:15:58.920 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134705.3660595, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134705.3660595, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 266385, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134705.3660595, "nlink": 1, "path": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:12:49 -0400 (0:00:01.483) 0:16:00.403 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:12:54 -0400 (0:00:04.574) 0:16:04.978 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.009717", "end": "2026-03-21 19:12:55.506731", "rc": 0, "start": "2026-03-21 19:12:55.497014" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 14 9b 43 bd 00 fb 8f 3d 95 ee 53 23 cc 53 b9 f5 79 03 16 a4 MK salt: 7a a3 c7 53 5a 94 e2 50 84 d7 36 38 4e 79 44 7c ac 9d 50 38 0a 7f 2c 89 75 5b 28 a6 a1 26 9e 58 MK iterations: 120470 UUID: f3a44b09-71d6-43a0-aacd-e66622f7a65f Key Slot 0: ENABLED Iterations: 1923992 Salt: d4 94 a1 a8 58 a7 20 24 93 b8 63 f3 81 1e c4 66 10 30 31 89 b7 ac 4d 96 29 06 75 a9 
d9 2b ae 84 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:12:55 -0400 (0:00:01.623) 0:16:06.601 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:12:55 -0400 (0:00:00.216) 0:16:06.817 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:12:56 -0400 (0:00:00.376) 0:16:07.194 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:12:56 -0400 (0:00:00.225) 0:16:07.420 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:12:56 -0400 (0:00:00.244) 0:16:07.664 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:12:57 -0400 (0:00:00.325) 0:16:07.989 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:12:57 -0400 (0:00:00.401) 0:16:08.391 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:12:57 -0400 (0:00:00.220) 0:16:08.612 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:12:58 -0400 (0:00:00.251) 0:16:08.863 ******** ok: [managed-node9] => { "changed": false 
} MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:12:58 -0400 (0:00:00.259) 0:16:09.123 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:12:58 -0400 (0:00:00.222) 0:16:09.345 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:12:58 -0400 (0:00:00.181) 0:16:09.527 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:12:58 -0400 (0:00:00.195) 0:16:09.722 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:12:59 -0400 (0:00:00.199) 0:16:09.922 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:12:59 -0400 (0:00:00.321) 0:16:10.244 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:12:59 -0400 (0:00:00.283) 0:16:10.527 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:12:59 -0400 (0:00:00.289) 0:16:10.816 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:13:00 -0400 (0:00:00.206) 0:16:11.023 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:13:00 -0400 (0:00:00.252) 0:16:11.276 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:13:00 -0400 (0:00:00.283) 0:16:11.559 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:13:00 -0400 (0:00:00.179) 0:16:11.739 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:13:01 -0400 (0:00:00.236) 0:16:11.975 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:13:01 -0400 (0:00:00.175) 0:16:12.151 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:13:01 -0400 (0:00:00.281) 0:16:12.433 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:13:04 -0400 (0:00:03.065) 0:16:15.498 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:13:05 -0400 (0:00:01.163) 0:16:16.662 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:13:06 -0400 (0:00:00.270) 0:16:16.933 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:13:06 -0400 (0:00:00.213) 0:16:17.146 
******** ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:13:07 -0400 (0:00:01.399) 0:16:18.545 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:13:07 -0400 (0:00:00.134) 0:16:18.680 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:13:08 -0400 (0:00:00.295) 0:16:18.975 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:13:08 -0400 (0:00:00.330) 0:16:19.306 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:13:08 -0400 (0:00:00.253) 0:16:19.560 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:13:08 -0400 (0:00:00.201) 0:16:19.762 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:13:09 -0400 (0:00:00.214) 0:16:19.977 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:13:09 -0400 (0:00:00.265) 0:16:20.243 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:13:09 -0400 (0:00:00.274) 0:16:20.517 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:13:10 -0400 (0:00:00.353) 
0:16:20.870 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:13:10 -0400 (0:00:00.325) 0:16:21.196 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:13:10 -0400 (0:00:00.331) 0:16:21.528 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:13:10 -0400 (0:00:00.255) 0:16:21.783 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:13:11 -0400 (0:00:00.258) 0:16:22.042 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:13:11 -0400 (0:00:00.244) 0:16:22.286 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:13:11 -0400 (0:00:00.248) 0:16:22.535 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:13:11 -0400 (0:00:00.264) 0:16:22.799 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:13:12 -0400 (0:00:00.286) 0:16:23.086 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:13:12 -0400 (0:00:00.205) 0:16:23.292 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:13:12 -0400 
(0:00:00.228) 0:16:23.520 ******** ok: [managed-node9] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:13:12 -0400 (0:00:00.197) 0:16:23.717 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:13:13 -0400 (0:00:00.207) 0:16:23.925 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:13:13 -0400 (0:00:00.212) 0:16:24.137 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.024391", "end": "2026-03-21 19:13:14.544589", "rc": 0, "start": "2026-03-21 19:13:14.520198" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:13:14 -0400 (0:00:01.430) 0:16:25.568 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:13:14 -0400 (0:00:00.226) 0:16:25.795 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:13:15 -0400 (0:00:00.304) 0:16:26.099 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:13:15 -0400 (0:00:00.250) 0:16:26.350 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:13:15 -0400 (0:00:00.227) 0:16:26.578 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:13:15 -0400 (0:00:00.210) 0:16:26.789 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:13:16 -0400 (0:00:00.222) 0:16:27.011 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:13:16 -0400 (0:00:00.288) 0:16:27.300 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:13:16 -0400 (0:00:00.251) 0:16:27.552 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391 Saturday 21 March 2026 19:13:16 -0400 (0:00:00.171) 0:16:27.723 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:13:17 -0400 (0:00:00.442) 0:16:28.166 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:13:17 -0400 (0:00:00.190) 0:16:28.357 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:13:17 -0400 (0:00:00.205) 0:16:28.562 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:13:17 -0400 (0:00:00.130) 0:16:28.693 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:13:20 -0400 (0:00:02.258) 0:16:30.972 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": 
"Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:13:20 -0400 (0:00:00.332) 0:16:31.304 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:13:20 -0400 (0:00:00.156) 0:16:31.460 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:13:20 -0400 (0:00:00.188) 0:16:31.649 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:13:20 -0400 (0:00:00.192) 0:16:31.841 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:13:21 -0400 (0:00:00.161) 0:16:32.002 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:13:21 -0400 (0:00:00.356) 0:16:32.358 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:13:21 -0400 (0:00:00.238) 0:16:32.597 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:13:21 -0400 (0:00:00.162) 0:16:32.759 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:13:26 -0400 (0:00:04.433) 0:16:37.193 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:13:26 -0400 (0:00:00.253) 0:16:37.446 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:13:26 -0400 (0:00:00.318) 0:16:37.764 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:13:32 -0400 (0:00:05.661) 0:16:43.426 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:13:32 -0400 (0:00:00.252) 0:16:43.678 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:13:32 -0400 (0:00:00.122) 0:16:43.801 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:13:33 -0400 (0:00:00.274) 0:16:44.075 ******** TASK [fedora.linux_system_roles.storage : Make sure 
required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:13:33 -0400 (0:00:00.232) 0:16:44.307 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:13:38 -0400 (0:00:04.615) 0:16:48.922 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", 
"source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { 
"name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service": { "name": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service": { "name": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:13:40 -0400 (0:00:02.782) 0:16:51.704 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d03531268\x2d47a9\x2d4948\x2dbc77\x2d34ba208d0f8e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "name": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-sda1.device cryptsetup-pre.target tmp.mount system-systemd\\x2dcryptsetup.slice -.mount systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module 
cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-03531268-47a9-4948-bc77-34ba208d0f8e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-03531268-47a9-4948-bc77-34ba208d0f8e /dev/sda1 /tmp/storage_testno1nigzclukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-03531268-47a9-4948-bc77-34ba208d0f8e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", 
"MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount system-systemd\\x2dcryptsetup.slice", "RequiresMountsFor": "/tmp/storage_testno1nigzclukskey", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:11:56 EDT", "StateChangeTimestampMonotonic": "2479413505", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d47a9\x2d4948\x2dbc77\x2d34ba208d0f8e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "name": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", 
"MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:13:44 -0400 (0:00:03.779) 0:16:55.484 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:13:50 -0400 (0:00:05.862) 0:17:01.346 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:13:50 -0400 (0:00:00.302) 0:17:01.649 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134714.0721102, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5b0e0b19bd21f83e98347073085dda97746c1758", "ctime": 1774134714.0691102, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134714.0691102, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:13:52 -0400 (0:00:01.390) 0:17:03.039 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:13:52 -0400 (0:00:00.201) 0:17:03.240 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d03531268\x2d47a9\x2d4948\x2dbc77\x2d34ba208d0f8e.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "name": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": 
"14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2d03531268\\x2d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d47a9\x2d4948\x2dbc77\x2d34ba208d0f8e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "name": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service is masked.\"", "LoadState": "masked", 
"LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d47a9\\x2d4948\\x2dbc77\\x2d34ba208d0f8e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:13:55 -0400 (0:00:03.414) 0:17:06.654 ******** ok: [managed-node9] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" } ], "packages": [ "cryptsetup", "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:13:55 -0400 (0:00:00.178) 0:17:06.833 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": 
null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:13:56 -0400 (0:00:00.792) 0:17:07.625 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:13:56 -0400 (0:00:00.164) 0:17:07.790 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:13:57 -0400 (0:00:00.084) 0:17:07.874 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:13:58 -0400 (0:00:01.209) 0:17:09.084 ******** ok: [managed-node9] => (item={'src': '/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:13:59 -0400 (0:00:01.346) 0:17:10.431 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:13:59 -0400 (0:00:00.250) 0:17:10.682 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:14:01 -0400 (0:00:01.576) 0:17:12.259 ******** ok: [managed-node9] => { "changed": false, 
"stat": { "atime": 1774134728.725195, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "dabe5e473594e927685ea32d770f243b4f18772e", "ctime": 1774134721.1231508, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 39846086, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1774134721.122151, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3488844892", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:14:02 -0400 (0:00:01.403) 0:17:13.663 ******** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:14:02 -0400 (0:00:00.139) 0:17:13.802 ******** ok: [managed-node9] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Saturday 21 March 2026 19:14:04 -0400 (0:00:01.767) 0:17:15.569 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:411 Saturday 21 March 2026 19:14:04 -0400 (0:00:00.177) 0:17:15.747 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:14:05 -0400 (0:00:00.340) 0:17:16.088 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", 
"fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:14:05 -0400 (0:00:00.293) 0:17:16.381 ******** skipping: [managed-node9] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:14:05 -0400 (0:00:00.217) 0:17:16.598 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f3a44b09-71d6-43a0-aacd-e66622f7a65f" }, "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "size": "4G", "type": "crypt", "uuid": "3890bc67-451e-4383-bdd1-5d2ad5f20353" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:14:07 -0400 (0:00:01.448) 0:17:18.046 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002613", "end": 
"2026-03-21 19:14:08.429826", "rc": 0, "start": "2026-03-21 19:14:08.427213" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:14:08 -0400 (0:00:01.625) 0:17:19.672 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002627", "end": "2026-03-21 19:14:09.627993", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:14:09.625366" } STDOUT: luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:14:09 -0400 (0:00:00.964) 0:17:20.636 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 21 March 2026 19:14:10 -0400 (0:00:00.295) 0:17:20.932 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 21 March 2026 19:14:10 -0400 (0:00:00.181) 0:17:21.114 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.027543", "end": "2026-03-21 19:14:11.446604", "rc": 0, "start": "2026-03-21 19:14:11.419061" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 21 March 
2026 19:14:11 -0400 (0:00:01.478) 0:17:22.592 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 21 March 2026 19:14:11 -0400 (0:00:00.240) 0:17:22.833 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 21 March 2026 19:14:12 -0400 (0:00:00.411) 0:17:23.244 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 21 March 2026 19:14:12 -0400 (0:00:00.197) 0:17:23.442 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 21 March 2026 19:14:13 -0400 (0:00:00.987) 0:17:24.429 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 21 March 2026 19:14:13 -0400 (0:00:00.175) 0:17:24.604 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 21 March 2026 19:14:13 -0400 (0:00:00.101) 0:17:24.706 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 21 March 2026 19:14:14 -0400 (0:00:00.170) 0:17:24.877 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 21 March 2026 19:14:14 -0400 (0:00:00.202) 0:17:25.079 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 21 March 2026 19:14:14 -0400 (0:00:00.151) 0:17:25.231 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 21 March 2026 19:14:14 -0400 (0:00:00.260) 0:17:25.491 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 21 March 2026 19:14:14 -0400 (0:00:00.202) 0:17:25.693 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.97 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 21 March 2026 19:14:16 -0400 (0:00:01.599) 0:17:27.293 ******** skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 21 March 2026 19:14:16 -0400 (0:00:00.157) 0:17:27.450 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 21 March 2026 19:14:16 -0400 (0:00:00.288) 0:17:27.739 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 21 March 2026 19:14:17 -0400 (0:00:00.271) 0:17:28.011 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 21 March 2026 19:14:17 -0400 (0:00:00.181) 0:17:28.192 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 21 March 2026 19:14:17 -0400 (0:00:00.260) 0:17:28.452 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 21 March 2026 19:14:17 -0400 (0:00:00.261) 0:17:28.714 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 21 March 2026 19:14:18 -0400 (0:00:00.221) 0:17:28.936 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 21 March 2026 19:14:18 -0400 (0:00:00.199) 0:17:29.136 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 21 March 2026 19:14:18 -0400 (0:00:00.185) 0:17:29.321 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 21 March 2026 19:14:18 -0400 (0:00:00.178) 0:17:29.500 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 21 March 2026 19:14:18 -0400 (0:00:00.196) 0:17:29.697 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 21 March 2026 19:14:19 -0400 (0:00:00.248) 0:17:29.946 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 21 March 2026 19:14:19 -0400 (0:00:00.240) 0:17:30.186 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 21 March 2026 19:14:19 -0400 (0:00:00.461) 0:17:30.648 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 TASK [Get information about the LV] ******************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 21 March 2026 19:14:20 -0400 (0:00:00.206) 0:17:30.854 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 21 March 2026 19:14:20 -0400 (0:00:00.134) 0:17:30.988 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 21 March 2026 19:14:20 -0400 (0:00:00.165) 0:17:31.154 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 21 March 2026 19:14:20 -0400 (0:00:00.159) 0:17:31.313 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 21 March 2026 19:14:20 -0400 (0:00:00.137) 0:17:31.451 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 21 March 2026 19:14:20 -0400 (0:00:00.176) 0:17:31.627 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 21 March 2026 19:14:20 -0400 (0:00:00.191) 0:17:31.818 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 21 March 2026 19:14:21 -0400 (0:00:00.196) 0:17:32.015 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 21 March 2026 19:14:21 -0400 (0:00:00.489) 0:17:32.504 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 TASK [Get information about thinpool] ****************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 21 March 2026 19:14:22 -0400 (0:00:00.423) 0:17:32.928 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 21 March 2026 19:14:22 -0400 (0:00:00.179) 0:17:33.107 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 21 March 2026 19:14:22 -0400 (0:00:00.135) 0:17:33.243 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 21 March 2026 19:14:22 -0400 (0:00:00.156) 0:17:33.399 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 21 March 2026 19:14:22 -0400 (0:00:00.136) 0:17:33.536 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 21 March 2026 19:14:23 -0400 (0:00:00.423) 0:17:33.959 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 21 March 2026 19:14:23 -0400 (0:00:00.219) 0:17:34.179 ******** skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 21 March 2026 19:14:23 -0400 (0:00:00.307) 0:17:34.487 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 21 March 2026 19:14:23 -0400 (0:00:00.348) 0:17:34.835 ******** ok: [managed-node9] => { "ansible_facts": { 
"_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 21 March 2026 19:14:24 -0400 (0:00:00.204) 0:17:35.040 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 21 March 2026 19:14:24 -0400 (0:00:00.175) 0:17:35.216 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 21 March 2026 19:14:24 -0400 (0:00:00.194) 0:17:35.411 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 21 March 2026 19:14:24 -0400 (0:00:00.210) 0:17:35.622 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 21 March 2026 19:14:25 -0400 (0:00:00.301) 0:17:35.923 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 21 March 2026 19:14:25 -0400 (0:00:00.220) 0:17:36.143 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 21 March 2026 19:14:25 -0400 (0:00:00.215) 0:17:36.359 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 21 March 2026 19:14:26 -0400 (0:00:00.488) 0:17:36.847 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 21 March 2026 19:14:26 -0400 (0:00:00.350) 0:17:37.198 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 21 March 2026 19:14:26 -0400 (0:00:00.279) 0:17:37.478 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 21 March 2026 19:14:26 -0400 (0:00:00.245) 0:17:37.724 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 21 March 2026 19:14:27 -0400 (0:00:00.255) 0:17:37.979 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 21 March 2026 19:14:27 -0400 (0:00:00.221) 0:17:38.200 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 21 March 2026 19:14:27 -0400 (0:00:00.207) 0:17:38.408 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 21 March 2026 19:14:27 -0400 (0:00:00.201) 0:17:38.609 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 21 March 2026 19:14:27 -0400 (0:00:00.192) 0:17:38.802 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 21 March 2026 19:14:28 -0400 (0:00:00.462) 0:17:39.265 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 21 March 2026 19:14:28 -0400 (0:00:00.183) 0:17:39.449 ******** skipping: [managed-node9] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 21 March 
2026 19:14:28 -0400 (0:00:00.143) 0:17:39.592 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 21 March 2026 19:14:29 -0400 (0:00:00.312) 0:17:39.905 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 21 March 2026 19:14:29 -0400 (0:00:00.164) 0:17:40.070 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 21 March 2026 19:14:29 -0400 (0:00:00.154) 0:17:40.224 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 21 March 2026 19:14:29 -0400 (0:00:00.222) 0:17:40.446 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 21 March 2026 19:14:29 -0400 (0:00:00.161) 0:17:40.608 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 21 March 2026 19:14:29 -0400 (0:00:00.203) 0:17:40.811 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:14:30 -0400 (0:00:00.283) 0:17:41.095 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:14:30 -0400 (0:00:00.289) 0:17:41.385 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:14:31 -0400 (0:00:01.439) 0:17:42.824 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:14:32 -0400 (0:00:00.234) 0:17:43.059 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:14:32 -0400 (0:00:00.313) 0:17:43.372 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:14:32 -0400 (0:00:00.328) 0:17:43.700 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:14:33 -0400 (0:00:00.229) 0:17:43.930 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:14:33 -0400 (0:00:00.216) 0:17:44.146 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:14:33 -0400 
(0:00:00.207) 0:17:44.354 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:14:33 -0400 (0:00:00.274) 0:17:44.628 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:14:34 -0400 (0:00:00.251) 0:17:44.880 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:14:34 -0400 (0:00:00.236) 0:17:45.117 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:14:34 -0400 (0:00:00.237) 0:17:45.354 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:14:34 -0400 (0:00:00.131) 0:17:45.486 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:14:35 -0400 (0:00:00.455) 0:17:45.942 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:14:35 -0400 (0:00:00.174) 0:17:46.116 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:14:35 -0400 (0:00:00.144) 0:17:46.261 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:14:35 -0400 (0:00:00.196) 0:17:46.457 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:14:35 -0400 (0:00:00.319) 0:17:46.777 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:14:36 -0400 (0:00:00.296) 0:17:47.073 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:14:36 -0400 (0:00:00.330) 0:17:47.404 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:14:36 -0400 (0:00:00.293) 0:17:47.697 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134775.501466, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134705.2170587, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 265193, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134705.2170587, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:14:38 -0400 (0:00:01.547) 0:17:49.244 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:14:38 -0400 (0:00:00.165) 0:17:49.410 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make 
sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:14:38 -0400 (0:00:00.234) 0:17:49.644 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:14:38 -0400 (0:00:00.193) 0:17:49.838 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:14:39 -0400 (0:00:00.215) 0:17:50.054 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:14:39 -0400 (0:00:00.164) 0:17:50.219 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:14:39 -0400 (0:00:00.174) 0:17:50.394 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134830.200783, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134705.3660595, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 266385, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134705.3660595, "nlink": 1, "path": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:14:40 -0400 (0:00:01.407) 0:17:51.801 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:14:44 -0400 (0:00:03.930) 0:17:55.731 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010421", "end": "2026-03-21 19:14:45.816713", "rc": 0, "start": "2026-03-21 19:14:45.806292" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: 
xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 14 9b 43 bd 00 fb 8f 3d 95 ee 53 23 cc 53 b9 f5 79 03 16 a4 MK salt: 7a a3 c7 53 5a 94 e2 50 84 d7 36 38 4e 79 44 7c ac 9d 50 38 0a 7f 2c 89 75 5b 28 a6 a1 26 9e 58 MK iterations: 120470 UUID: f3a44b09-71d6-43a0-aacd-e66622f7a65f Key Slot 0: ENABLED Iterations: 1923992 Salt: d4 94 a1 a8 58 a7 20 24 93 b8 63 f3 81 1e c4 66 10 30 31 89 b7 ac 4d 96 29 06 75 a9 d9 2b ae 84 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:14:46 -0400 (0:00:01.113) 0:17:56.845 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:14:46 -0400 (0:00:00.252) 0:17:57.097 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:14:46 -0400 (0:00:00.281) 0:17:57.379 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:14:46 -0400 (0:00:00.211) 0:17:57.591 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:14:47 -0400 (0:00:00.275) 0:17:57.867 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:14:47 -0400 (0:00:00.318) 0:17:58.186 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:14:47 -0400 (0:00:00.189) 0:17:58.375 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:14:47 -0400 (0:00:00.244) 0:17:58.620 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ 
"luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:14:48 -0400 (0:00:00.291) 0:17:58.912 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:14:48 -0400 (0:00:00.225) 0:17:59.137 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:14:48 -0400 (0:00:00.296) 0:17:59.434 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:14:48 -0400 (0:00:00.290) 0:17:59.725 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:14:49 -0400 (0:00:00.347) 0:18:00.072 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:14:49 -0400 (0:00:00.155) 0:18:00.227 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:14:49 -0400 (0:00:00.212) 0:18:00.440 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:14:49 -0400 (0:00:00.228) 0:18:00.669 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:14:49 -0400 (0:00:00.102) 0:18:00.772 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:14:50 -0400 (0:00:00.170) 0:18:00.942 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:14:50 -0400 (0:00:00.221) 0:18:01.164 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:14:50 -0400 (0:00:00.175) 0:18:01.339 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:14:50 -0400 (0:00:00.170) 0:18:01.510 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:14:50 -0400 (0:00:00.193) 0:18:01.704 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:14:51 -0400 (0:00:00.188) 0:18:01.893 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:14:51 -0400 (0:00:00.103) 0:18:01.996 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:14:52 -0400 (0:00:01.363) 0:18:03.359 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:14:53 -0400 (0:00:01.424) 0:18:04.784 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:14:54 -0400 (0:00:00.273) 0:18:05.058 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:14:54 -0400 (0:00:00.224) 0:18:05.282 ******** ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:14:55 -0400 (0:00:01.396) 0:18:06.679 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:14:56 -0400 (0:00:00.183) 0:18:06.862 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:14:56 -0400 (0:00:00.208) 0:18:07.071 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:14:56 -0400 (0:00:00.183) 0:18:07.254 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:14:56 -0400 (0:00:00.156) 0:18:07.411 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:14:56 -0400 (0:00:00.091) 0:18:07.502 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:14:56 -0400 (0:00:00.172) 0:18:07.675 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:14:56 -0400 (0:00:00.102) 0:18:07.778 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:14:57 -0400 (0:00:00.197) 0:18:07.975 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:14:57 -0400 (0:00:00.185) 0:18:08.161 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:14:57 -0400 (0:00:00.159) 0:18:08.320 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:14:57 -0400 (0:00:00.264) 0:18:08.585 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:14:57 -0400 (0:00:00.251) 0:18:08.836 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:14:58 -0400 (0:00:00.229) 0:18:09.065 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:14:58 -0400 (0:00:00.226) 0:18:09.292 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:14:58 -0400 (0:00:00.214) 0:18:09.506 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:14:58 -0400 (0:00:00.210) 0:18:09.717 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:14:59 -0400 (0:00:00.342) 0:18:10.059 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:14:59 -0400 (0:00:00.210) 0:18:10.270 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:14:59 -0400 (0:00:00.365) 0:18:10.636 ******** ok: [managed-node9] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:15:00 -0400 (0:00:00.237) 0:18:10.873 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:15:00 -0400 (0:00:00.178) 0:18:11.052 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:15:00 -0400 (0:00:00.417) 0:18:11.470 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023921", "end": "2026-03-21 19:15:01.786615", "rc": 0, "start": "2026-03-21 19:15:01.762694" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:15:02 -0400 (0:00:01.408) 0:18:12.878 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:15:02 -0400 (0:00:00.147) 0:18:13.026 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:15:02 -0400 (0:00:00.300) 0:18:13.326 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:15:02 -0400 (0:00:00.209) 0:18:13.535 ******** skipping: [managed-node9] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:15:02 -0400 (0:00:00.242) 0:18:13.778 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:15:03 -0400 (0:00:00.337) 0:18:14.116 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:15:03 -0400 (0:00:00.145) 0:18:14.261 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:15:03 -0400 (0:00:00.128) 0:18:14.390 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:15:03 -0400 (0:00:00.110) 0:18:14.501 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 21 March 2026 19:15:03 -0400 (0:00:00.149) 0:18:14.650 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:417 Saturday 21 March 2026 19:15:04 -0400 (0:00:01.073) 0:18:15.723 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 19:15:05 -0400 (0:00:00.357) 0:18:16.081 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 19:15:05 -0400 (0:00:00.159) 0:18:16.240 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:15:05 -0400 (0:00:00.248) 0:18:16.488 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:15:05 -0400 (0:00:00.189) 0:18:16.678 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:15:06 -0400 (0:00:00.243) 0:18:16.921 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:15:06 -0400 (0:00:00.245) 0:18:17.166 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:15:08 -0400 (0:00:01.812) 0:18:18.979 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:15:08 -0400 (0:00:00.264) 0:18:19.244 ******** skipping: [managed-node9] 
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:15:08 -0400 (0:00:00.175) 0:18:19.420 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:15:08 -0400 (0:00:00.170) 0:18:19.591 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:15:08 -0400 (0:00:00.161) 0:18:19.753 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:15:09 -0400 (0:00:00.167) 0:18:19.920 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:15:09 -0400 (0:00:00.488) 0:18:20.409 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:15:09 -0400 (0:00:00.242) 0:18:20.651 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:15:10 -0400 (0:00:00.224) 0:18:20.875 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:15:14 -0400 (0:00:04.135) 0:18:25.011 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:15:14 -0400 (0:00:00.195) 0:18:25.206 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:15:14 -0400 (0:00:00.258) 0:18:25.465 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:15:20 -0400 (0:00:05.675) 0:18:31.140 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:15:20 -0400 (0:00:00.342) 0:18:31.483 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:15:20 -0400 (0:00:00.184) 0:18:31.667 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:15:21 -0400 (0:00:00.269) 0:18:31.937 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:15:21 -0400 (0:00:00.203) 0:18:32.140 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:15:25 -0400 (0:00:03.724) 0:18:35.864 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": 
"plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service": { "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service": { "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { 
"name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:15:28 -0400 (0:00:03.031) 0:18:38.896 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2df3a44b09\x2d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f ; ignore_errors=no ; start_time=[n/a] 
; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", 
"Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:13:44 EDT", "StateChangeTimestampMonotonic": "2587392610", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", 
"FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", 
"StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:15:31 -0400 (0:00:03.780) 0:18:42.677 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:15:36 -0400 (0:00:05.144) 0:18:47.821 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f' in safe mode due to encryption removal", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 
'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:15:37 -0400 (0:00:00.294) 0:18:48.115 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2df3a44b09\x2d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": 
"systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": 
"no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:13:44 EDT", "StateChangeTimestampMonotonic": "2587392610", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": 
"", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:15:40 -0400 (0:00:03.270) 0:18:51.385 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:15:40 -0400 (0:00:00.173) 0:18:51.559 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Saturday 21 March 2026 19:15:41 -0400 (0:00:00.342) 0:18:51.901 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 21 March 2026 19:15:41 -0400 (0:00:00.286) 0:18:52.188 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134904.622214, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774134904.622214, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1774134904.622214, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1132058778", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 21 March 2026 19:15:42 -0400 (0:00:01.648) 0:18:53.837 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** 
task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:440 Saturday 21 March 2026 19:15:43 -0400 (0:00:00.263) 0:18:54.100 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:15:43 -0400 (0:00:00.399) 0:18:54.500 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:15:43 -0400 (0:00:00.225) 0:18:54.725 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:15:44 -0400 (0:00:00.209) 0:18:54.934 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:15:44 -0400 (0:00:00.209) 0:18:55.143 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:15:46 -0400 (0:00:02.391) 0:18:57.535 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:15:47 -0400 (0:00:00.459) 0:18:57.994 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:15:47 -0400 (0:00:00.252) 0:18:58.247 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:15:47 -0400 (0:00:00.482) 0:18:58.729 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:15:48 -0400 (0:00:00.189) 0:18:58.919 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:15:48 -0400 (0:00:00.188) 0:18:59.108 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:15:48 -0400 (0:00:00.461) 0:18:59.569 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:15:48 -0400 (0:00:00.266) 0:18:59.836 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:15:49 -0400 (0:00:00.262) 0:19:00.098 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:15:53 -0400 (0:00:04.361) 0:19:04.460 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", 
"size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:15:53 -0400 (0:00:00.155) 0:19:04.616 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:15:53 -0400 (0:00:00.123) 0:19:04.740 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:15:59 -0400 (0:00:05.373) 0:19:10.113 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:15:59 -0400 (0:00:00.124) 0:19:10.238 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:15:59 -0400 (0:00:00.083) 0:19:10.321 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:15:59 -0400 (0:00:00.117) 0:19:10.438 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:15:59 -0400 (0:00:00.181) 0:19:10.620 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:16:03 -0400 (0:00:03.779) 0:19:14.400 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": 
"kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": 
"oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" 
}, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service": { "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service": { "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": 
"systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": 
"user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:16:06 -0400 (0:00:02.920) 0:19:17.321 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2df3a44b09\x2d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", 
"RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:13:44 EDT", "StateChangeTimestampMonotonic": "2587392610", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", 
"ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service not found.\"", "LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": 
"inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:16:09 -0400 (0:00:03.034) 0:19:20.355 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:16:15 -0400 (0:00:05.709) 0:19:26.065 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:16:15 -0400 (0:00:00.174) 0:19:26.239 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134714.0721102, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5b0e0b19bd21f83e98347073085dda97746c1758", "ctime": 1774134714.0691102, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134714.0691102, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:16:16 -0400 (0:00:01.348) 0:19:27.588 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:16:18 -0400 (0:00:01.342) 0:19:28.930 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2df3a44b09\x2d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": 
"[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", 
"MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.device", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:13:44 EDT", "StateChangeTimestampMonotonic": "2587392610", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search 
cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": 
"no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:16:21 -0400 (0:00:03.488) 0:19:32.418 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:16:21 -0400 (0:00:00.240) 0:19:32.659 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list 
of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:16:22 -0400 (0:00:00.204) 0:19:32.863 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:16:22 -0400 (0:00:00.224) 0:19:33.088 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:16:23 -0400 (0:00:01.640) 0:19:34.728 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:16:25 -0400 (0:00:01.737) 0:19:36.465 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:16:27 -0400 (0:00:01.631) 0:19:38.097 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:16:27 -0400 (0:00:00.317) 0:19:38.415 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : 
Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:16:29 -0400 (0:00:01.757) 0:19:40.172 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134728.725195, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "dabe5e473594e927685ea32d770f243b4f18772e", "ctime": 1774134721.1231508, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 39846086, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1774134721.122151, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3488844892", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:16:30 -0400 (0:00:01.574) 0:19:41.747 ******** changed: [managed-node9] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:16:32 -0400 (0:00:01.686) 0:19:43.433 ******** ok: [managed-node9] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:455 Saturday 21 March 2026 19:16:34 -0400 (0:00:02.167) 0:19:45.601 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:16:35 -0400 (0:00:00.351) 0:19:45.952 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:16:35 -0400 (0:00:00.324) 0:19:46.277 ******** skipping: [managed-node9] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:16:35 -0400 (0:00:00.287) 0:19:46.565 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "b4598b4c-6939-4ad3-9956-89d892834d74" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:16:37 -0400 (0:00:01.765) 0:19:48.331 ******** ok: [managed-node9] => { "changed": false, "cmd": [ 
"cat", "/etc/fstab" ], "delta": "0:00:00.003045", "end": "2026-03-21 19:16:39.017075", "rc": 0, "start": "2026-03-21 19:16:39.014030" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:16:39 -0400 (0:00:01.837) 0:19:50.168 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003066", "end": "2026-03-21 19:16:40.630931", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:16:40.627865" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:16:40 -0400 (0:00:01.561) 0:19:51.730 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 21 March 2026 19:16:41 -0400 (0:00:00.542) 0:19:52.272 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 21 March 2026 19:16:41 -0400 (0:00:00.169) 0:19:52.441 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.027068", "end": "2026-03-21 19:16:43.124974", "rc": 0, "start": "2026-03-21 19:16:43.097906" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 21 March 2026 19:16:43 -0400 (0:00:01.791) 0:19:54.232 
******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 21 March 2026 19:16:43 -0400 (0:00:00.239) 0:19:54.472 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 21 March 2026 19:16:44 -0400 (0:00:00.585) 0:19:55.057 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 21 March 2026 19:16:44 -0400 (0:00:00.302) 0:19:55.359 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 21 March 2026 19:16:46 -0400 (0:00:02.278) 0:19:57.637 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 21 March 2026 19:16:47 -0400 (0:00:00.241) 0:19:57.878 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 21 March 2026 19:16:47 -0400 (0:00:00.219) 0:19:58.097 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 21 March 2026 19:16:47 -0400 (0:00:00.296) 0:19:58.394 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 21 March 2026 19:16:47 -0400 (0:00:00.343) 0:19:58.737 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 21 March 
2026 19:16:48 -0400 (0:00:00.335) 0:19:59.073 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Saturday 21 March 2026 19:16:48 -0400 (0:00:00.302) 0:19:59.375 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 21 March 2026 19:16:48 -0400 (0:00:00.364) 0:19:59.740 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.97 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 21 March 2026 19:16:50 -0400 (0:00:01.980) 0:20:01.720 ******** skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 21 March 2026 19:16:51 -0400 (0:00:00.211) 0:20:01.932 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 21 March 2026 19:16:51 -0400 (0:00:00.531) 0:20:02.464 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 21 March 2026 19:16:51 -0400 (0:00:00.203) 0:20:02.668 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 21 March 2026 19:16:52 -0400 (0:00:00.315) 0:20:02.984 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 21 March 2026 19:16:52 -0400 (0:00:00.369) 0:20:03.353 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 21 March 2026 19:16:52 -0400 
(0:00:00.286) 0:20:03.639 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 21 March 2026 19:16:53 -0400 (0:00:00.392) 0:20:04.032 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 21 March 2026 19:16:53 -0400 (0:00:00.332) 0:20:04.365 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 21 March 2026 19:16:53 -0400 (0:00:00.297) 0:20:04.663 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 21 March 2026 19:16:54 -0400 (0:00:00.233) 0:20:04.896 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 21 March 2026 19:16:54 -0400 (0:00:00.259) 0:20:05.155 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 21 March 2026 19:16:54 -0400 (0:00:00.335) 0:20:05.490 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 21 March 2026 19:16:54 -0400 (0:00:00.249) 0:20:05.740 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 21 March 2026 19:16:55 -0400 (0:00:00.473) 0:20:06.213 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 21 March 2026 19:16:55 -0400 
(0:00:00.433) 0:20:06.646 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 21 March 2026 19:16:56 -0400 (0:00:00.244) 0:20:06.890 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 21 March 2026 19:16:56 -0400 (0:00:00.269) 0:20:07.160 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 21 March 2026 19:16:56 -0400 (0:00:00.241) 0:20:07.401 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 21 March 2026 19:16:56 -0400 (0:00:00.307) 0:20:07.708 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 21 March 2026 19:16:57 -0400 (0:00:00.296) 0:20:08.005 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 21 March 2026 19:16:57 -0400 (0:00:00.422) 0:20:08.427 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 21 March 2026 19:16:57 -0400 (0:00:00.290) 0:20:08.718 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 21 March 2026 19:16:58 -0400 (0:00:00.413) 0:20:09.131 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 21 March 2026 19:16:58 -0400 (0:00:00.409) 0:20:09.541 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 21 March 2026 19:16:58 -0400 (0:00:00.282) 0:20:09.824 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 21 March 2026 19:16:59 -0400 (0:00:00.304) 0:20:10.128 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 21 March 2026 19:16:59 -0400 (0:00:00.276) 0:20:10.405 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 21 March 2026 19:16:59 -0400 (0:00:00.241) 0:20:10.646 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 21 March 2026 19:17:00 -0400 (0:00:00.638) 0:20:11.284 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 21 March 2026 19:17:00 -0400 (0:00:00.207) 0:20:11.492 ******** skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 21 March 2026 19:17:01 -0400 (0:00:00.419) 0:20:11.912 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 21 March 2026 19:17:01 -0400 (0:00:00.390) 0:20:12.302 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 
Saturday 21 March 2026 19:17:01 -0400 (0:00:00.412) 0:20:12.715 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 21 March 2026 19:17:02 -0400 (0:00:00.284) 0:20:12.999 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 21 March 2026 19:17:02 -0400 (0:00:00.250) 0:20:13.250 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 21 March 2026 19:17:02 -0400 (0:00:00.173) 0:20:13.423 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 21 March 2026 19:17:02 -0400 (0:00:00.259) 0:20:13.682 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 21 March 2026 19:17:03 -0400 (0:00:00.204) 0:20:13.887 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 21 March 2026 19:17:03 -0400 (0:00:00.200) 0:20:14.088 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 21 March 2026 19:17:03 -0400 (0:00:00.484) 0:20:14.572 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 21 March 2026 19:17:04 -0400 (0:00:00.390) 0:20:14.962 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 21 March 2026 19:17:04 -0400 (0:00:00.270) 
0:20:15.233 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 21 March 2026 19:17:04 -0400 (0:00:00.289) 0:20:15.523 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 21 March 2026 19:17:04 -0400 (0:00:00.256) 0:20:15.779 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 21 March 2026 19:17:05 -0400 (0:00:00.238) 0:20:16.018 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 21 March 2026 19:17:05 -0400 (0:00:00.393) 0:20:16.411 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 21 March 2026 19:17:05 -0400 (0:00:00.287) 0:20:16.699 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 21 March 2026 19:17:06 -0400 (0:00:00.270) 0:20:16.970 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 21 March 2026 19:17:06 -0400 (0:00:00.473) 0:20:17.443 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 21 March 2026 19:17:06 -0400 (0:00:00.203) 0:20:17.646 ******** skipping: [managed-node9] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 21 March 2026 19:17:07 -0400 (0:00:00.301) 0:20:17.948 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 21 March 2026 19:17:07 -0400 (0:00:00.271) 0:20:18.219 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 21 March 2026 19:17:07 -0400 (0:00:00.198) 0:20:18.417 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 21 March 2026 19:17:07 -0400 (0:00:00.235) 0:20:18.653 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 21 March 2026 19:17:08 -0400 (0:00:00.219) 0:20:18.872 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 21 March 2026 19:17:08 -0400 (0:00:00.123) 0:20:18.996 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 21 March 2026 19:17:08 -0400 (0:00:00.151) 0:20:19.148 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:17:08 -0400 (0:00:00.325) 0:20:19.473 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:17:08 -0400 (0:00:00.182) 0:20:19.655 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:17:09 -0400 (0:00:01.054) 0:20:20.709 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:17:10 -0400 (0:00:00.247) 0:20:20.957 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:17:10 -0400 (0:00:00.212) 0:20:21.169 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:17:10 -0400 (0:00:00.275) 0:20:21.444 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:17:10 -0400 (0:00:00.365) 0:20:21.809 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:17:11 -0400 (0:00:00.273) 0:20:22.083 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:17:11 -0400 (0:00:00.311) 0:20:22.395 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:17:11 -0400 (0:00:00.275) 0:20:22.670 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:17:12 -0400 (0:00:00.216) 0:20:22.886 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:17:12 -0400 (0:00:00.217) 0:20:23.104 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:17:12 -0400 (0:00:00.244) 0:20:23.349 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:17:12 -0400 (0:00:00.193) 0:20:23.542 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:17:13 -0400 (0:00:00.470) 0:20:24.012 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:17:13 -0400 (0:00:00.266) 0:20:24.279 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:17:13 -0400 (0:00:00.313) 0:20:24.592 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 
Saturday 21 March 2026 19:17:14 -0400 (0:00:00.295) 0:20:24.888 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:17:14 -0400 (0:00:00.253) 0:20:25.142 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:17:14 -0400 (0:00:00.280) 0:20:25.422 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:17:14 -0400 (0:00:00.370) 0:20:25.793 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:17:15 -0400 (0:00:00.343) 0:20:26.137 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134974.9056149, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774134974.9056149, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 298511, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774134974.9056149, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:17:16 -0400 (0:00:01.418) 0:20:27.555 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:17:16 -0400 (0:00:00.233) 0:20:27.788 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:17:17 -0400 (0:00:00.112) 
0:20:27.901 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:17:17 -0400 (0:00:00.215) 0:20:28.116 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:17:17 -0400 (0:00:00.236) 0:20:28.353 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:17:17 -0400 (0:00:00.220) 0:20:28.573 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:17:18 -0400 (0:00:00.388) 0:20:28.961 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:17:18 -0400 (0:00:00.245) 0:20:29.207 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:17:22 -0400 (0:00:04.301) 0:20:33.509 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:17:22 -0400 (0:00:00.145) 0:20:33.655 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:17:23 -0400 (0:00:00.211) 0:20:33.866 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:17:23 -0400 (0:00:00.240) 0:20:34.107 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:17:23 -0400 (0:00:00.231) 0:20:34.338 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:17:23 -0400 (0:00:00.249) 0:20:34.587 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:17:23 -0400 (0:00:00.238) 0:20:34.825 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:17:24 -0400 (0:00:00.142) 0:20:34.968 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:17:24 -0400 (0:00:00.105) 0:20:35.074 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:17:24 -0400 (0:00:00.070) 0:20:35.144 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:17:24 -0400 (0:00:00.064) 0:20:35.209 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:17:24 -0400 (0:00:00.110) 0:20:35.320 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:17:24 -0400 (0:00:00.166) 0:20:35.486 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:17:24 -0400 (0:00:00.197) 0:20:35.683 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:17:25 -0400 (0:00:00.162) 0:20:35.846 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:17:25 -0400 (0:00:00.212) 0:20:36.058 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:17:25 -0400 (0:00:00.172) 0:20:36.230 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:17:25 -0400 (0:00:00.195) 0:20:36.426 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:17:25 -0400 (0:00:00.128) 0:20:36.555 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:17:25 -0400 (0:00:00.214) 0:20:36.769 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:17:26 -0400 (0:00:00.138) 0:20:36.907 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:17:26 -0400 (0:00:00.213) 0:20:37.121 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:17:26 -0400 (0:00:00.156) 0:20:37.278 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:17:26 -0400 (0:00:00.182) 0:20:37.460 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:17:26 -0400 (0:00:00.121) 0:20:37.582 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:17:28 -0400 (0:00:01.380) 0:20:38.963 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:17:29 -0400 (0:00:01.530) 0:20:40.493 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:17:29 -0400 (0:00:00.211) 0:20:40.705 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:17:30 -0400 (0:00:00.174) 0:20:40.879 ******** ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:17:31 -0400 (0:00:01.446) 0:20:42.326 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:17:31 -0400 (0:00:00.249) 0:20:42.576 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:17:31 -0400 (0:00:00.154) 0:20:42.730 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage 
value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:17:32 -0400 (0:00:00.207) 0:20:42.937 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:17:32 -0400 (0:00:00.235) 0:20:43.173 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:17:32 -0400 (0:00:00.200) 0:20:43.373 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:17:32 -0400 (0:00:00.233) 0:20:43.606 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:17:32 -0400 (0:00:00.194) 0:20:43.801 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:17:33 -0400 (0:00:00.152) 0:20:43.953 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:17:33 -0400 (0:00:00.123) 0:20:44.077 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:17:33 -0400 (0:00:00.163) 0:20:44.240 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:17:33 -0400 (0:00:00.225) 0:20:44.466 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:17:33 -0400 (0:00:00.153) 0:20:44.619 ******** skipping: [managed-node9] => {} TASK [Show 
test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:17:33 -0400 (0:00:00.167) 0:20:44.787 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:17:34 -0400 (0:00:00.174) 0:20:44.961 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:17:34 -0400 (0:00:00.145) 0:20:45.106 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:17:34 -0400 (0:00:00.251) 0:20:45.358 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:17:34 -0400 (0:00:00.201) 0:20:45.560 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:17:34 -0400 (0:00:00.162) 0:20:45.723 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:17:35 -0400 (0:00:00.125) 0:20:45.849 ******** ok: [managed-node9] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:17:35 -0400 (0:00:00.193) 0:20:46.042 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:17:35 -0400 (0:00:00.196) 0:20:46.239 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:17:35 
-0400 (0:00:00.265) 0:20:46.505 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023002", "end": "2026-03-21 19:17:36.927672", "rc": 0, "start": "2026-03-21 19:17:36.904670" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:17:37 -0400 (0:00:01.501) 0:20:48.006 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:17:37 -0400 (0:00:00.255) 0:20:48.262 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:17:37 -0400 (0:00:00.186) 0:20:48.448 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:17:37 -0400 (0:00:00.183) 0:20:48.632 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:17:37 -0400 (0:00:00.186) 0:20:48.818 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:17:38 -0400 (0:00:00.203) 0:20:49.021 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:17:38 -0400 (0:00:00.147) 0:20:49.169 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:17:38 -0400 (0:00:00.131) 0:20:49.300 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 
Saturday 21 March 2026 19:17:38 -0400 (0:00:00.170) 0:20:49.470 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 21 March 2026 19:17:38 -0400 (0:00:00.124) 0:20:49.595 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:461 Saturday 21 March 2026 19:17:40 -0400 (0:00:01.375) 0:20:50.970 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 21 March 2026 19:17:40 -0400 (0:00:00.287) 0:20:51.258 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 21 March 2026 19:17:40 -0400 (0:00:00.194) 0:20:51.452 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:17:40 -0400 (0:00:00.237) 0:20:51.690 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:17:40 -0400 (0:00:00.111) 0:20:51.802 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:17:41 -0400 (0:00:00.303) 0:20:52.105 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:17:41 -0400 (0:00:00.202) 0:20:52.307 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 
2026 19:17:43 -0400 (0:00:02.106) 0:20:54.414 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:17:43 -0400 (0:00:00.404) 0:20:54.818 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:17:44 -0400 (0:00:00.303) 0:20:55.122 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:17:44 -0400 (0:00:00.295) 0:20:55.418 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:17:44 -0400 (0:00:00.140) 0:20:55.558 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:17:44 -0400 (0:00:00.145) 0:20:55.704 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:17:45 -0400 (0:00:00.531) 0:20:56.235 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:17:45 -0400 (0:00:00.204) 0:20:56.439 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:17:45 -0400 (0:00:00.041) 0:20:56.481 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:17:49 -0400 (0:00:03.565) 0:21:00.046 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:17:49 -0400 (0:00:00.110) 0:21:00.157 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:17:49 -0400 (0:00:00.161) 0:21:00.318 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:17:54 -0400 (0:00:05.090) 0:21:05.408 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:17:54 -0400 (0:00:00.122) 0:21:05.531 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:17:54 -0400 (0:00:00.079) 0:21:05.610 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:17:54 -0400 (0:00:00.036) 0:21:05.646 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:17:54 -0400 (0:00:00.117) 0:21:05.764 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:17:58 -0400 (0:00:03.239) 0:21:09.003 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "grub-boot-indeterminate.service": { "name": 
"grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": 
"syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service": { "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "source": "systemd", "state": "inactive", "status": "generated" }, "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service": { "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": 
"systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:18:00 -0400 (0:00:02.188) 0:21:11.192 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2df3a44b09\x2d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "dev-mapper-foo\\x2dtest1.device systemd-journald.socket system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", 
"CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "0", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "65536", "LimitMEMLOCKSoft": "65536", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "262144", "LimitNOFILESoft": "1024", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", 
"LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sat 2026-03-21 19:13:44 EDT", "StateChangeTimestampMonotonic": "2587392610", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "infinity", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": 
"", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.NoSuchUnit \"Unit systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service not found.\"", 
"LoadState": "not-found", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:18:02 -0400 (0:00:02.181) 0:21:13.374 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Saturday 21 March 2026 19:18:07 -0400 (0:00:04.633) 0:21:18.007 ******** fatal: [managed-node9]: FAILED! 
=> { "changed": false } MSG: {'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:18:07 -0400 (0:00:00.195) 0:21:18.203 ******** changed: [managed-node9] => 
(item=systemd-cryptsetup@luks\x2df3a44b09\x2d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": 
"819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luks\\x2df3a44b09\\x2d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } changed: [managed-node9] => (item=systemd-cryptsetup@luk...d71d6\x2d43a0\x2daacd\x2de66622f7a65f.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "name": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", 
"ActiveState": "inactive", "AllowIsolate": "no", "AllowedCPUs": "", "AllowedMemoryNodes": "", "AmbientCapabilities": "", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "no", "CPUAffinity": "", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlPID": "0", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "DevicePolicy": "auto", "DynamicUser": "no", "EffectiveCPUs": "", "EffectiveMemoryNodes": "", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOSchedulingClass": "0", "IOSchedulingPriority": "0", "IOWeight": "[not set]", "IPAccounting": "no", "IPEgressBytes": "18446744073709551615", "IPEgressPackets": "18446744073709551615", "IPIngressBytes": "18446744073709551615", "IPIngressPackets": "18446744073709551615", "Id": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "67108864", "LimitMEMLOCKSoft": "67108864", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1048576", "LimitNOFILESoft": "1048576", "LimitNPROC": "14003", "LimitNPROCSoft": "14003", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "14003", "LimitSIGPENDINGSoft": "14003", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": 
"org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "MemoryAccounting": "yes", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "MountFlags": "", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAMask": "", "NUMAPolicy": "n/a", "Names": "systemd-cryptsetup@luk...d71d6\\x2d43a0\\x2daacd\\x2de66622f7a65f.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "Perpetual": "no", "PrivateDevices": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardInputData": "", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "22405", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Saturday 21 March 2026 19:18:10 -0400 (0:00:03.012) 0:21:21.216 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Saturday 21 March 2026 19:18:10 -0400 (0:00:00.129) 0:21:21.346 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 
Saturday 21 March 2026 19:18:10 -0400 (0:00:00.068) 0:21:21.415 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 21 March 2026 19:18:10 -0400 (0:00:00.080) 0:21:21.495 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774135059.9200993, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774135059.9200993, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1774135059.9200993, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3494559426", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 21 March 2026 19:18:11 -0400 (0:00:00.783) 0:21:22.279 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:484 Saturday 21 March 2026 19:18:11 -0400 (0:00:00.066) 0:21:22.346 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:18:11 -0400 (0:00:00.209) 0:21:22.555 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:18:11 -0400 (0:00:00.092) 0:21:22.648 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:18:11 -0400 (0:00:00.110) 0:21:22.759 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:18:11 -0400 (0:00:00.077) 0:21:22.836 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:18:13 -0400 (0:00:01.096) 0:21:23.933 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:18:13 -0400 (0:00:00.370) 0:21:24.303 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:18:13 -0400 (0:00:00.177) 0:21:24.481 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:18:13 -0400 (0:00:00.115) 0:21:24.596 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:18:13 -0400 (0:00:00.056) 0:21:24.653 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:18:13 -0400 (0:00:00.065) 0:21:24.718 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK 
[fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:18:14 -0400 (0:00:00.210) 0:21:24.929 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:18:14 -0400 (0:00:00.096) 0:21:25.025 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:18:14 -0400 (0:00:00.199) 0:21:25.225 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:18:17 -0400 (0:00:03.566) 0:21:28.791 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:18:18 -0400 (0:00:00.235) 0:21:29.027 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:18:18 -0400 (0:00:00.187) 0:21:29.215 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:18:23 -0400 (0:00:04.972) 0:21:34.187 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:18:23 -0400 (0:00:00.175) 0:21:34.363 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:18:23 -0400 (0:00:00.068) 0:21:34.431 ******** skipping: [managed-node9] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:18:23 -0400 (0:00:00.090) 0:21:34.521 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:18:23 -0400 (0:00:00.111) 0:21:34.633 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:18:27 -0400 (0:00:03.767) 0:21:38.401 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, 
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": 
{ "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:18:30 -0400 (0:00:03.186) 0:21:41.588 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:18:31 -0400 (0:00:00.332) 0:21:41.921 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, 
"encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:18:44 -0400 (0:00:13.297) 0:21:55.218 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:18:44 -0400 (0:00:00.261) 0:21:55.480 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774134986.9956837, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a1522684f5b6a445a50f2611a4e0757a4aec1cf1", "ctime": 1774134986.9926836, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774134986.9926836, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1393, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:18:45 -0400 (0:00:01.363) 0:21:56.844 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:18:47 -0400 (0:00:01.373) 0:21:58.217 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:18:47 -0400 (0:00:00.281) 0:21:58.499 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:18:47 -0400 (0:00:00.294) 0:21:58.794 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:18:48 -0400 (0:00:00.254) 0:21:59.049 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:18:48 -0400 (0:00:00.298) 0:21:59.347 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:18:50 -0400 (0:00:01.647) 0:22:00.994 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:18:52 -0400 (0:00:01.902) 0:22:02.897 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:18:53 -0400 (0:00:01.277) 0:22:04.174 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:18:53 -0400 (0:00:00.268) 0:22:04.443 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:18:55 -0400 (0:00:01.691) 0:22:06.135 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774135000.6297615, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1774134992.3317142, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 352321687, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1774134992.329714, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "879203226", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:18:56 -0400 (0:00:01.440) 0:22:07.576 ******** changed: 
[managed-node9] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-0cea85bc-813c-428a-b38b-dec5e406e24d', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:18:58 -0400 (0:00:01.389) 0:22:08.965 ******** ok: [managed-node9] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:499 Saturday 21 March 2026 19:19:00 -0400 (0:00:02.072) 0:22:11.037 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:19:00 -0400 (0:00:00.457) 0:22:11.495 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:19:00 -0400 (0:00:00.213) 0:22:11.709 ******** skipping: [managed-node9] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:19:01 -0400 (0:00:00.223) 0:22:11.933 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "0cea85bc-813c-428a-b38b-dec5e406e24d" }, "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "size": "4G", "type": "crypt", "uuid": "a6b8e83c-922b-45cc-b256-b68886108789" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:19:02 -0400 (0:00:01.363) 0:22:13.296 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002674", "end": "2026-03-21 19:19:03.575278", "rc": 0, "start": "2026-03-21 19:19:03.572604" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:19:03 -0400 (0:00:01.501) 0:22:14.798 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002604", "end": "2026-03-21 19:19:05.068860", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:19:05.066256" } STDOUT: luks-0cea85bc-813c-428a-b38b-dec5e406e24d /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:19:05 -0400 (0:00:01.325) 0:22:16.123 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 21 March 2026 19:19:05 -0400 (0:00:00.299) 0:22:16.422 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 21 March 2026 19:19:05 -0400 (0:00:00.070) 0:22:16.493 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.024999", "end": "2026-03-21 19:19:06.706734", "rc": 0, "start": "2026-03-21 19:19:06.681735" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 21 March 2026 19:19:06 -0400 (0:00:01.253) 0:22:17.746 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 21 March 2026 19:19:07 -0400 (0:00:00.345) 0:22:18.091 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 21 March 2026 19:19:07 -0400 (0:00:00.464) 0:22:18.556 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 21 March 2026 19:19:07 -0400 (0:00:00.222) 0:22:18.779 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 21 March 2026 19:19:09 -0400 (0:00:01.415) 0:22:20.195 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 21 March 2026 19:19:09 -0400 (0:00:00.314) 0:22:20.509 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 21 March 2026 19:19:09 -0400 (0:00:00.227) 0:22:20.737 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 21 March 2026 19:19:10 -0400 (0:00:00.325) 0:22:21.062 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 21 March 2026 19:19:10 -0400 (0:00:00.284) 0:22:21.347 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 21 March 2026 19:19:10 -0400 (0:00:00.267) 0:22:21.614 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 
Saturday 21 March 2026 19:19:10 -0400 (0:00:00.188) 0:22:21.802 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Saturday 21 March 2026 19:19:11 -0400 (0:00:00.329) 0:22:22.132 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 1 } STDERR: Shared connection to 10.31.10.97 closed. MSG: non-zero return code TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Saturday 21 March 2026 19:19:12 -0400 (0:00:01.111) 0:22:23.244 ******** skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Saturday 21 March 2026 19:19:12 -0400 (0:00:00.263) 0:22:23.507 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 21 March 2026 19:19:13 -0400 (0:00:00.502) 0:22:24.010 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 21 March 2026 19:19:13 -0400 (0:00:00.352) 0:22:24.362 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 21 March 2026 19:19:13 -0400 (0:00:00.170) 0:22:24.533 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 21 March 2026 19:19:13 -0400 (0:00:00.237) 0:22:24.771 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 21 March 2026 19:19:14 -0400 (0:00:00.212) 0:22:24.983 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 21 March 2026 19:19:14 
-0400 (0:00:00.150) 0:22:25.134 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 21 March 2026 19:19:14 -0400 (0:00:00.145) 0:22:25.279 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 21 March 2026 19:19:14 -0400 (0:00:00.178) 0:22:25.457 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 21 March 2026 19:19:14 -0400 (0:00:00.224) 0:22:25.682 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 21 March 2026 19:19:15 -0400 (0:00:00.265) 0:22:25.947 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 21 March 2026 19:19:15 -0400 (0:00:00.182) 0:22:26.129 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Saturday 21 March 2026 19:19:15 -0400 (0:00:00.104) 0:22:26.234 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 21 March 2026 19:19:15 -0400 (0:00:00.474) 0:22:26.709 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 21 March 2026 19:19:16 -0400 (0:00:00.487) 0:22:27.196 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 21 March 2026 19:19:16 
-0400 (0:00:00.326) 0:22:27.522 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 21 March 2026 19:19:17 -0400 (0:00:00.341) 0:22:27.864 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 21 March 2026 19:19:17 -0400 (0:00:00.310) 0:22:28.175 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 21 March 2026 19:19:17 -0400 (0:00:00.273) 0:22:28.448 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 21 March 2026 19:19:17 -0400 (0:00:00.356) 0:22:28.805 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 21 March 2026 19:19:18 -0400 (0:00:00.262) 0:22:29.068 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Saturday 21 March 2026 19:19:18 -0400 (0:00:00.317) 0:22:29.385 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 21 March 2026 19:19:19 -0400 (0:00:00.494) 0:22:29.879 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 21 March 2026 19:19:19 -0400 (0:00:00.347) 0:22:30.227 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 21 March 2026 19:19:19 -0400 (0:00:00.289) 0:22:30.517 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 21 March 2026 19:19:19 -0400 (0:00:00.272) 0:22:30.789 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 21 March 2026 19:19:20 -0400 (0:00:00.193) 0:22:30.983 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Saturday 21 March 2026 19:19:20 -0400 (0:00:00.147) 0:22:31.130 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 21 March 2026 19:19:20 -0400 (0:00:00.460) 0:22:31.590 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 21 March 2026 19:19:21 -0400 (0:00:00.274) 0:22:31.865 ******** skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 21 March 2026 19:19:21 -0400 (0:00:00.252) 0:22:32.117 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 21 March 2026 19:19:21 -0400 (0:00:00.454) 0:22:32.572 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 21 March 2026 19:19:21 -0400 (0:00:00.240) 0:22:32.812 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 21 March 2026 
19:19:22 -0400 (0:00:00.275) 0:22:33.088 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 21 March 2026 19:19:22 -0400 (0:00:00.326) 0:22:33.414 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 21 March 2026 19:19:22 -0400 (0:00:00.274) 0:22:33.688 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 21 March 2026 19:19:22 -0400 (0:00:00.118) 0:22:33.807 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 21 March 2026 19:19:23 -0400 (0:00:00.164) 0:22:33.971 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Saturday 21 March 2026 19:19:23 -0400 (0:00:00.181) 0:22:34.153 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 21 March 2026 19:19:23 -0400 (0:00:00.467) 0:22:34.620 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 21 March 2026 19:19:24 -0400 (0:00:00.299) 0:22:34.920 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 21 March 2026 19:19:24 -0400 (0:00:00.137) 0:22:35.057 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 21 March 2026 19:19:24 -0400 (0:00:00.281) 
0:22:35.339 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 21 March 2026 19:19:25 -0400 (0:00:01.066) 0:22:36.405 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 21 March 2026 19:19:25 -0400 (0:00:00.230) 0:22:36.636 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 21 March 2026 19:19:26 -0400 (0:00:00.283) 0:22:36.919 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 21 March 2026 19:19:26 -0400 (0:00:00.314) 0:22:37.234 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Saturday 21 March 2026 19:19:26 -0400 (0:00:00.349) 0:22:37.583 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 21 March 2026 19:19:27 -0400 (0:00:00.533) 0:22:38.117 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 21 March 2026 19:19:27 -0400 (0:00:00.238) 0:22:38.355 ******** skipping: [managed-node9] => {} TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Saturday 21 March 2026 19:19:27 -0400 (0:00:00.241) 0:22:38.597 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Saturday 21 March 2026 19:19:27 -0400 (0:00:00.195) 0:22:38.792 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Saturday 21 March 2026 19:19:28 -0400 (0:00:00.164) 0:22:38.957 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Saturday 21 March 2026 19:19:28 -0400 (0:00:00.252) 0:22:39.210 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Saturday 21 March 2026 19:19:28 -0400 (0:00:00.195) 0:22:39.406 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Saturday 21 March 2026 19:19:28 -0400 (0:00:00.144) 0:22:39.550 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 21 March 2026 19:19:28 -0400 (0:00:00.148) 0:22:39.699 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:19:29 -0400 (0:00:00.331) 0:22:40.031 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:19:29 -0400 (0:00:00.240) 0:22:40.272 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:19:30 -0400 (0:00:01.055) 0:22:41.327 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:19:30 -0400 (0:00:00.329) 0:22:41.656 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:19:31 -0400 (0:00:00.285) 0:22:41.942 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:19:31 -0400 (0:00:00.251) 0:22:42.193 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:19:31 -0400 (0:00:00.234) 0:22:42.428 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:19:31 -0400 (0:00:00.307) 0:22:42.735 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:19:32 -0400 (0:00:00.261) 0:22:42.996 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:19:32 -0400 (0:00:00.204) 0:22:43.201 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:19:32 -0400 (0:00:00.220) 0:22:43.422 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:19:32 -0400 (0:00:00.187) 0:22:43.609 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:19:33 -0400 (0:00:00.259) 0:22:43.869 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:19:33 -0400 (0:00:00.281) 0:22:44.150 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:19:33 -0400 (0:00:00.523) 0:22:44.674 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:19:34 -0400 (0:00:00.360) 0:22:45.035 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:19:34 -0400 (0:00:00.257) 0:22:45.292 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:19:34 -0400 (0:00:00.274) 0:22:45.567 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Saturday 21 March 2026 19:19:35 -0400 (0:00:00.307) 0:22:45.874 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:19:35 -0400 (0:00:00.173) 0:22:46.047 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:19:35 -0400 (0:00:00.242) 0:22:46.290 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:19:35 -0400 (0:00:00.261) 0:22:46.551 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774135123.866464, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774135123.866464, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 298511, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774135123.866464, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:19:37 -0400 (0:00:01.609) 0:22:48.160 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:19:37 -0400 (0:00:00.224) 0:22:48.385 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:19:37 -0400 (0:00:00.249) 0:22:48.635 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:19:37 -0400 (0:00:00.195) 0:22:48.831 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:19:38 -0400 (0:00:00.360) 0:22:49.192 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:19:38 -0400 (0:00:00.251) 0:22:49.443 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:19:38 -0400 (0:00:00.196) 0:22:49.640 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774135124.0134647, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774135124.0134647, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 315341, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1774135124.0134647, "nlink": 1, "path": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:19:40 -0400 (0:00:01.556) 0:22:51.196 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:19:44 -0400 (0:00:04.062) 0:22:55.258 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.010299", "end": "2026-03-21 19:19:45.352310", "rc": 0, "start": "2026-03-21 19:19:45.342011" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 0cea85bc-813c-428a-b38b-dec5e406e24d Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2i Time cost: 4 Memory: 919414 Threads: 2 Salt: 10 56 fa 85 fc 72 57 92 e0 18 eb 05 35 43 57 
fa 27 7a 91 45 e1 6a 25 50 9e bb a6 7b d9 61 3d cb AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 120029 Salt: 99 79 e4 fa b9 43 4b 19 8a 37 c1 d6 12 0c d7 22 6f f6 c6 c5 50 67 3d 50 fb c8 57 c0 06 24 8a ee Digest: 71 57 bc a4 f3 43 96 69 3f 5c 02 3b 3b e9 43 77 02 a8 00 50 b6 98 45 61 cf db 1a 32 4e de 99 69 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:19:45 -0400 (0:00:01.224) 0:22:56.483 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:19:45 -0400 (0:00:00.275) 0:22:56.759 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:19:46 -0400 (0:00:00.196) 0:22:56.956 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 2026 19:19:46 -0400 (0:00:00.206) 0:22:57.162 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:19:46 -0400 (0:00:00.188) 0:22:57.351 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:19:46 -0400 (0:00:00.220) 0:22:57.572 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:19:46 -0400 (0:00:00.221) 0:22:57.793 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:19:47 -0400 (0:00:00.210) 0:22:58.004 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-0cea85bc-813c-428a-b38b-dec5e406e24d /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab 
entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:19:47 -0400 (0:00:00.303) 0:22:58.308 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:19:47 -0400 (0:00:00.247) 0:22:58.556 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:19:47 -0400 (0:00:00.234) 0:22:58.790 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:19:48 -0400 (0:00:00.412) 0:22:59.203 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:19:48 -0400 (0:00:00.220) 0:22:59.424 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:19:48 -0400 (0:00:00.196) 0:22:59.620 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:19:48 -0400 (0:00:00.213) 0:22:59.834 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:19:49 -0400 (0:00:00.245) 0:23:00.079 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:19:49 -0400 (0:00:00.225) 0:23:00.304 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:19:49 -0400 (0:00:00.257) 0:23:00.561 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:19:49 -0400 (0:00:00.232) 0:23:00.794 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:19:50 -0400 (0:00:00.095) 0:23:00.890 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:19:50 -0400 (0:00:00.131) 0:23:01.021 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:19:50 -0400 (0:00:00.187) 0:23:01.209 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:19:50 -0400 (0:00:00.127) 0:23:01.337 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:19:50 -0400 (0:00:00.147) 0:23:01.485 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:19:52 -0400 (0:00:01.609) 0:23:03.095 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:19:53 -0400 (0:00:01.529) 0:23:04.625 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:19:53 -0400 
(0:00:00.172) 0:23:04.798 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:19:54 -0400 (0:00:00.130) 0:23:04.928 ******** ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:19:55 -0400 (0:00:00.964) 0:23:05.892 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:19:55 -0400 (0:00:00.104) 0:23:05.997 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:19:55 -0400 (0:00:00.300) 0:23:06.297 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:19:55 -0400 (0:00:00.277) 0:23:06.574 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:19:56 -0400 (0:00:00.275) 0:23:06.850 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:19:56 -0400 (0:00:00.276) 0:23:07.127 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:19:56 -0400 (0:00:00.169) 0:23:07.296 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:19:56 -0400 (0:00:00.178) 0:23:07.475 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:19:56 -0400 (0:00:00.110) 
0:23:07.585 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:19:56 -0400 (0:00:00.200) 0:23:07.786 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:19:57 -0400 (0:00:00.223) 0:23:08.010 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:19:57 -0400 (0:00:00.289) 0:23:08.299 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:19:57 -0400 (0:00:00.152) 0:23:08.452 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:19:57 -0400 (0:00:00.191) 0:23:08.643 ******** skipping: [managed-node9] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:19:57 -0400 (0:00:00.141) 0:23:08.785 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:19:58 -0400 (0:00:00.140) 0:23:08.925 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:19:58 -0400 (0:00:00.162) 0:23:09.088 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:19:58 -0400 (0:00:00.147) 0:23:09.235 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:19:58 -0400 
(0:00:00.146) 0:23:09.382 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:19:58 -0400 (0:00:00.192) 0:23:09.574 ******** ok: [managed-node9] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:19:58 -0400 (0:00:00.078) 0:23:09.653 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:19:58 -0400 (0:00:00.109) 0:23:09.762 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:19:59 -0400 (0:00:00.191) 0:23:09.954 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023935", "end": "2026-03-21 19:19:59.975699", "rc": 0, "start": "2026-03-21 19:19:59.951764" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.987) 0:23:10.942 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.053) 0:23:10.995 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.120) 0:23:11.116 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.108) 0:23:11.224 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.117) 0:23:11.342 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.130) 0:23:11.473 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.184) 0:23:11.658 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:20:00 -0400 (0:00:00.094) 0:23:11.752 ******** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:20:01 -0400 (0:00:00.130) 0:23:11.882 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:502 Saturday 21 March 2026 19:20:01 -0400 (0:00:00.142) 0:23:12.025 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node9 META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Saturday 21 March 2026 19:20:01 -0400 (0:00:00.447) 0:23:12.473 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Saturday 21 March 2026 19:20:01 -0400 (0:00:00.232) 0:23:12.705 ******** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 21 March 2026 19:20:02 -0400 (0:00:00.256) 0:23:12.961 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 21 March 2026 19:20:02 -0400 (0:00:00.194) 0:23:13.156 ******** ok: [managed-node9] TASK [fedora.linux_system_roles.storage 
: Set platform/version specific variables] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 21 March 2026 19:20:04 -0400 (0:00:02.385) 0:23:15.541 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } ok: [managed-node9] => (item=CentOS_8.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_8.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_8.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 21 March 2026 19:20:05 -0400 (0:00:00.489) 0:23:16.031 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 21 March 2026 19:20:05 -0400 (0:00:00.286) 0:23:16.317 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 21 March 2026 19:20:05 -0400 (0:00:00.306) 0:23:16.623 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 21 March 2026 19:20:05 -0400 (0:00:00.204) 0:23:16.828 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 21 March 2026 19:20:06 -0400 (0:00:00.155) 0:23:16.984 ******** included: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 21 March 2026 19:20:06 -0400 (0:00:00.415) 0:23:17.399 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Saturday 21 March 2026 19:20:06 -0400 (0:00:00.118) 0:23:17.517 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Saturday 21 March 2026 19:20:06 -0400 (0:00:00.078) 0:23:17.596 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Saturday 21 March 2026 19:20:10 -0400 (0:00:03.911) 0:23:21.508 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Saturday 21 March 2026 19:20:10 -0400 (0:00:00.208) 0:23:21.716 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 21 March 2026 19:20:11 -0400 (0:00:00.158) 0:23:21.874 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Saturday 21 March 2026 19:20:16 -0400 (0:00:05.321) 0:23:27.195 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 21 March 2026 19:20:16 -0400 (0:00:00.365) 0:23:27.561 ******** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 21 March 2026 19:20:16 -0400 (0:00:00.149) 0:23:27.711 ******** skipping: [managed-node9] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 21 March 2026 19:20:17 -0400 (0:00:00.257) 0:23:27.968 ******** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Saturday 21 March 2026 19:20:17 -0400 (0:00:00.249) 0:23:28.218 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Saturday 21 March 2026 19:20:21 -0400 (0:00:03.845) 0:23:32.064 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.portable1.service": { "name": "dbus-org.freedesktop.portable1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, 
"grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "import-state.service": { "name": "import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "loadmodules.service": { "name": "loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": 
"mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-convert.service": { "name": "nfs-convert.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "static" }, "plymouth-switch-root-initramfs.service": { "name": "plymouth-switch-root-initramfs.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": 
{ "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "active", "status": "enabled" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-portabled.service": { "name": "systemd-portabled.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-resolved.service": { "name": "systemd-resolved.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "masked" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "tcsd.service": { "name": "tcsd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "timedatex.service": { "name": "timedatex.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "unbound-anchor.service": { "name": "unbound-anchor.service", "source": "systemd", "state": "stopped", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "unknown" }, "vdo-start-by-dev@.service": { "name": "vdo-start-by-dev@.service", "source": "systemd", "state": "unknown", "status": "static" }, "vdo.service": { "name": "vdo.service", "source": "systemd", "state": "stopped", "status": "enabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Saturday 21 March 2026 19:20:24 -0400 (0:00:03.695) 0:23:35.759 ******** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Saturday 21 March 2026 19:20:25 -0400 (0:00:00.404) 0:23:36.163 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": 
null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Saturday 21 March 2026 19:20:31 -0400 (0:00:06.477) 0:23:42.641 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Saturday 21 March 2026 19:20:32 -0400 (0:00:00.252) 0:23:42.893 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774135133.1115165, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f5280221d0ead55d78a5e45465b42a7b1176f952", "ctime": 1774135133.1085165, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1774135133.1085165, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1425, "uid": 0, "version": "133446622", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Saturday 21 March 2026 19:20:33 -0400 (0:00:01.476) 0:23:44.370 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Saturday 21 March 2026 19:20:35 -0400 (0:00:01.789) 0:23:46.160 ******** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Saturday 21 March 2026 19:20:35 -0400 (0:00:00.375) 0:23:46.535 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", 
"fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 21 March 2026 19:20:35 -0400 (0:00:00.246) 0:23:46.781 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Saturday 21 March 2026 19:20:36 -0400 (0:00:00.252) 0:23:47.034 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Saturday 21 March 2026 19:20:36 -0400 (0:00:00.300) 0:23:47.334 ******** changed: [managed-node9] => (item={'src': '/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0cea85bc-813c-428a-b38b-dec5e406e24d" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Saturday 21 March 2026 19:20:37 -0400 (0:00:01.363) 0:23:48.698 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Saturday 21 March 2026 19:20:39 -0400 (0:00:01.473) 0:23:50.171 ******** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 21 March 2026 19:20:39 -0400 (0:00:00.212) 0:23:50.384 ******** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Saturday 21 March 2026 19:20:39 -0400 (0:00:00.172) 0:23:50.557 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Saturday 21 March 2026 19:20:41 -0400 (0:00:01.999) 0:23:52.557 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774135145.0675848, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d68c2f82c1d634f94b5d3a9caf59da0fa93b634e", "ctime": 1774135137.8855438, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 515899526, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1774135137.8845437, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2693242790", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Saturday 21 March 2026 19:20:43 -0400 (0:00:01.425) 0:23:53.982 ******** changed: [managed-node9] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-0cea85bc-813c-428a-b38b-dec5e406e24d', 'password': 
'-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-0cea85bc-813c-428a-b38b-dec5e406e24d", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Saturday 21 March 2026 19:20:44 -0400 (0:00:01.600) 0:23:55.583 ******** ok: [managed-node9] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:511 Saturday 21 March 2026 19:20:46 -0400 (0:00:02.009) 0:23:57.592 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 21 March 2026 19:20:47 -0400 (0:00:00.561) 0:23:58.153 ******** skipping: [managed-node9] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 21 March 2026 19:20:47 -0400 (0:00:00.174) 0:23:58.328 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=FV4NCV-y18g-hE3Q-w15a-CQO8-jg2d-pQkOwv", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 21 March 2026 19:20:47 -0400 (0:00:00.377) 0:23:58.705 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fe591198-9082-4b15-9b62-e83518524cd2" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 21 March 2026 19:20:49 -0400 (0:00:01.386) 0:24:00.092 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002623", "end": "2026-03-21 19:20:50.434538", "rc": 0, "start": "2026-03-21 19:20:50.431915" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Wed May 29 07:43:06 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fe591198-9082-4b15-9b62-e83518524cd2 / xfs defaults 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 21 March 2026 19:20:50 -0400 (0:00:01.416) 0:24:01.508 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002697", "end": "2026-03-21 19:20:51.847632", "failed_when_result": false, "rc": 0, "start": "2026-03-21 19:20:51.844935" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 21 March 2026 19:20:52 -0400 (0:00:01.340) 0:24:02.849 ******** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Saturday 21 March 2026 19:20:52 -0400 (0:00:00.068) 0:24:02.917 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 21 March 2026 19:20:52 -0400 (0:00:00.130) 0:24:03.048 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 21 March 2026 19:20:52 -0400 (0:00:00.094) 0:24:03.142 ******** included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for 
managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 included: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 21 March 2026 19:20:52 -0400 (0:00:00.598) 0:24:03.741 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 21 March 2026 19:20:53 -0400 (0:00:00.287) 0:24:04.028 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 21 March 2026 19:20:53 -0400 (0:00:00.214) 0:24:04.243 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Saturday 21 March 2026 19:20:53 -0400 (0:00:00.212) 0:24:04.455 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Saturday 21 March 2026 19:20:53 -0400 (0:00:00.146) 0:24:04.602 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Saturday 21 March 2026 19:20:53 -0400 (0:00:00.226) 0:24:04.828 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Saturday 21 March 2026 19:20:54 -0400 (0:00:00.145) 0:24:04.973 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Saturday 21 March 2026 19:20:54 -0400 (0:00:00.194) 0:24:05.168 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] 
******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Saturday 21 March 2026 19:20:54 -0400 (0:00:00.235) 0:24:05.404 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Saturday 21 March 2026 19:20:54 -0400 (0:00:00.283) 0:24:05.688 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Saturday 21 March 2026 19:20:55 -0400 (0:00:00.239) 0:24:05.927 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 21 March 2026 19:20:55 -0400 (0:00:00.382) 0:24:06.309 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 21 March 2026 19:20:55 -0400 (0:00:00.387) 0:24:06.697 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 21 March 2026 19:20:56 -0400 (0:00:00.251) 0:24:06.948 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 21 March 2026 19:20:56 -0400 (0:00:00.248) 0:24:07.197 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 21 March 2026 19:20:56 -0400 (0:00:00.210) 0:24:07.428 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 
Saturday 21 March 2026 19:20:56 -0400 (0:00:00.226) 0:24:07.655 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 21 March 2026 19:20:57 -0400 (0:00:00.233) 0:24:07.889 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 21 March 2026 19:20:57 -0400 (0:00:00.242) 0:24:08.131 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 21 March 2026 19:20:57 -0400 (0:00:00.331) 0:24:08.463 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1774135231.406077, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1774135231.406077, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 36050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1774135231.406077, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 21 March 2026 19:20:59 -0400 (0:00:01.646) 0:24:10.109 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 21 March 2026 19:20:59 -0400 (0:00:00.296) 0:24:10.406 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 21 March 2026 19:20:59 -0400 (0:00:00.174) 0:24:10.580 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 21 March 2026 19:20:59 -0400 (0:00:00.080) 0:24:10.660 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 21 March 2026 19:21:00 -0400 (0:00:00.241) 0:24:10.902 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 21 March 2026 19:21:00 -0400 (0:00:00.268) 0:24:11.171 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 21 March 2026 19:21:00 -0400 (0:00:00.163) 0:24:11.335 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 21 March 2026 19:21:00 -0400 (0:00:00.288) 0:24:11.623 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 21 March 2026 19:21:05 -0400 (0:00:04.323) 0:24:15.946 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 21 March 2026 19:21:05 -0400 (0:00:00.367) 0:24:16.314 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 21 March 2026 19:21:05 -0400 (0:00:00.215) 0:24:16.529 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 21 March 2026 19:21:05 -0400 (0:00:00.210) 0:24:16.739 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 21 March 
2026 19:21:06 -0400 (0:00:00.294) 0:24:17.034 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 21 March 2026 19:21:06 -0400 (0:00:00.257) 0:24:17.291 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Saturday 21 March 2026 19:21:06 -0400 (0:00:00.191) 0:24:17.483 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Saturday 21 March 2026 19:21:06 -0400 (0:00:00.217) 0:24:17.700 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Saturday 21 March 2026 19:21:07 -0400 (0:00:00.149) 0:24:17.849 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Saturday 21 March 2026 19:21:07 -0400 (0:00:00.384) 0:24:18.234 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Saturday 21 March 2026 19:21:07 -0400 (0:00:00.236) 0:24:18.470 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Saturday 21 March 2026 19:21:07 -0400 (0:00:00.245) 0:24:18.733 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Saturday 21 March 2026 19:21:08 -0400 (0:00:00.213) 0:24:18.947 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Saturday 21 March 2026 19:21:08 -0400 (0:00:00.280) 0:24:19.228 
******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 21 March 2026 19:21:08 -0400 (0:00:00.217) 0:24:19.445 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 21 March 2026 19:21:08 -0400 (0:00:00.235) 0:24:19.681 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 21 March 2026 19:21:09 -0400 (0:00:00.233) 0:24:19.915 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 21 March 2026 19:21:09 -0400 (0:00:00.343) 0:24:20.259 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 21 March 2026 19:21:09 -0400 (0:00:00.219) 0:24:20.478 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 21 March 2026 19:21:09 -0400 (0:00:00.134) 0:24:20.612 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 21 March 2026 19:21:10 -0400 (0:00:00.235) 0:24:20.847 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 21 March 2026 19:21:10 -0400 (0:00:00.227) 0:24:21.075 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 21 March 2026 19:21:10 -0400 (0:00:00.196) 0:24:21.272 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 21 March 2026 19:21:10 -0400 (0:00:00.201) 0:24:21.473 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 21 March 2026 19:21:10 -0400 (0:00:00.127) 0:24:21.601 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 21 March 2026 19:21:10 -0400 (0:00:00.183) 0:24:21.784 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 21 March 2026 19:21:11 -0400 (0:00:00.187) 0:24:21.971 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 21 March 2026 19:21:11 -0400 (0:00:00.141) 0:24:22.113 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 21 March 2026 19:21:11 -0400 (0:00:00.159) 0:24:22.273 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 21 March 2026 19:21:11 -0400 (0:00:00.192) 0:24:22.465 ******** skipping: [managed-node9] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 21 March 2026 19:21:11 -0400 (0:00:00.275) 0:24:22.741 ******** skipping: [managed-node9] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 21 March 2026 19:21:12 -0400 (0:00:00.185) 0:24:22.926 ******** skipping: [managed-node9] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 21 March 2026 19:21:12 -0400 (0:00:00.119) 0:24:23.064 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool 
reserved space values] ********************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Saturday 21 March 2026 19:21:12 -0400 (0:00:00.153) 0:24:23.217 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Saturday 21 March 2026 19:21:12 -0400 (0:00:00.234) 0:24:23.452 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Saturday 21 March 2026 19:21:12 -0400 (0:00:00.259) 0:24:23.711 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Saturday 21 March 2026 19:21:12 -0400 (0:00:00.119) 0:24:23.830 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Saturday 21 March 2026 19:21:13 -0400 (0:00:00.199) 0:24:24.030 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Saturday 21 March 2026 19:21:13 -0400 (0:00:00.297) 0:24:24.327 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Saturday 21 March 2026 19:21:13 -0400 (0:00:00.255) 0:24:24.582 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Saturday 21 March 2026 19:21:14 -0400 (0:00:00.367) 0:24:24.950 ******** skipping: [managed-node9] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Saturday 21 March 2026 19:21:14 -0400 (0:00:00.219) 0:24:25.170 ******** skipping: [managed-node9] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Saturday 21 March 2026 19:21:14 -0400 (0:00:00.208) 0:24:25.378 ******** skipping: [managed-node9] => {} TASK [Establish base value for 
expected thin pool size] ************************ task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Saturday 21 March 2026 19:21:14 -0400 (0:00:00.234) 0:24:25.612 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Saturday 21 March 2026 19:21:14 -0400 (0:00:00.218) 0:24:25.831 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Saturday 21 March 2026 19:21:15 -0400 (0:00:00.272) 0:24:26.103 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Saturday 21 March 2026 19:21:15 -0400 (0:00:00.258) 0:24:26.362 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Saturday 21 March 2026 19:21:15 -0400 (0:00:00.240) 0:24:26.603 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Saturday 21 March 2026 19:21:15 -0400 (0:00:00.178) 0:24:26.782 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Saturday 21 March 2026 19:21:16 -0400 (0:00:00.333) 0:24:27.115 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Saturday 21 March 2026 19:21:16 -0400 (0:00:00.226) 0:24:27.342 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 21 March 2026 19:21:16 -0400 (0:00:00.257) 0:24:27.599 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 21 March 2026 19:21:17 -0400 (0:00:00.246) 0:24:27.846 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 21 March 2026 19:21:17 -0400 (0:00:00.231) 0:24:28.077 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 21 March 2026 19:21:17 -0400 (0:00:00.217) 0:24:28.295 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 21 March 2026 19:21:17 -0400 (0:00:00.304) 0:24:28.599 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 21 March 2026 19:21:18 -0400 (0:00:00.277) 0:24:28.877 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 21 March 2026 19:21:18 -0400 (0:00:00.309) 0:24:29.186 ******** skipping: [managed-node9] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 21 March 2026 19:21:18 -0400 (0:00:00.205) 0:24:29.392 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Saturday 21 March 2026 19:21:18 -0400 (0:00:00.178) 0:24:29.571 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node9 : ok=1245 changed=60 unreachable=0 failed=9 skipped=1115 rescued=9 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.9.27", "end_time": "2026-03-21T22:58:04.411662+00:00Z", "host": "managed-node9", "message": "encrypted volume 'foo' missing key/password", "start_time": "2026-03-21T22:57:59.429641+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": 
"/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T22:58:04.599395+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T22:58:04.419986+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:00:07.886973+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'luks-650cacf1-b078-496c-9d32-c0b5fdf3584c' in safe mode due to encryption removal", "start_time": "2026-03-21T23:00:02.931980+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": 
"2026-03-21T23:00:08.134006+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-650cacf1-b078-496c-9d32-c0b5fdf3584c' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:00:07.894020+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:02:00.148232+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-03-21T23:01:55.423622+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:02:00.230273+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": 
false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:02:00.155480+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:04:00.241907+00:00Z", "host": "managed-node9", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-03-21T23:03:54.876154+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:04:00.593630+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": 
false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:04:00.302115+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:06:24.062513+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'luks-c9479194-c226-4bd0-b788-f506f21ba738' in safe mode due to encryption removal", "start_time": "2026-03-21T23:06:18.724178+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": 
"/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:06:24.324684+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-c9479194-c226-4bd0-b788-f506f21ba738' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:06:24.083623+00:00Z", "task_name": "Failed message", 
"task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:08:44.701944+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-03-21T23:08:39.320986+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:08:44.926459+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:08:44.721096+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:11:10.166008+00:00Z", "host": "managed-node9", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-03-21T23:11:05.407308+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:11:10.403471+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:11:10.188493+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:15:36.970954+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f' in safe mode due to encryption removal", "start_time": "2026-03-21T23:15:31.833647+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:15:37.264102+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", 
"vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-f3a44b09-71d6-43a0-aacd-e66622f7a65f' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:15:36.977570+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:18:07.123538+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-03-21T23:18:02.530157+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.9.27", "end_time": "2026-03-21T23:18:07.352394+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-21T23:18:07.163908+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Saturday 21 March 2026 19:21:18 -0400 (0:00:00.252) 0:24:29.823 ******** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 14.23s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.62s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.43s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.36s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.30s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.49s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 9.13s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 fedora.linux_system_roles.storage : Manage the pools and volumes to match the 
specified state --- 6.48s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.88s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.87s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.86s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.76s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.71s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.68s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.66s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.54s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.54s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Get required packages --------------- 5.46s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get required packages --------------- 5.42s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 5.39s /tmp/collections-HYp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
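
The nine rescued failures listed under SYSTEM ROLES ERRORS above fall into two classes. The first class is "encrypted volume ... missing key/password": the run requests encryption on a volume while both encryption_key and encryption_password are null in the module arguments, so the role's blivet task refuses to create the LUKS device. The following is a minimal sketch (not taken from the test itself) of a storage role invocation that satisfies that check; the host, the luks_passphrase variable, and the plain roles: entry are illustrative placeholders, and a real playbook should supply the passphrase from Ansible Vault rather than plain text.

- hosts: managed-node9
  roles:
    - fedora.linux_system_roles.storage
  vars:
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        fs_type: xfs
        mount_point: /opt/test1
        encryption: true
        encryption_luks_version: luks2
        # luks_passphrase is an assumed variable; providing it here (or
        # setting encryption_key instead) avoids the
        # "encrypted volume 'foo' missing key/password" failure above.
        encryption_password: "{{ luks_passphrase }}"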
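The second class is "cannot remove existing formatting on device ... in safe mode due to adding encryption" / "... due to encryption removal": the role ran with safe_mode true, so any change that would destroy existing formatting on a device (layering LUKS over an existing filesystem, or stripping a LUKS layer that is already present) is rejected. These are the expected negative checks of the LUKS test; an actual change of this kind requires disabling safe mode. A hedged sketch follows, assuming the role's storage_safe_mode variable; the values are illustrative, and setting storage_safe_mode to false permits destructive reformatting, so it is only appropriate when losing the data on the listed disks is acceptable.

- hosts: managed-node9
  roles:
    - fedora.linux_system_roles.storage
  vars:
    # With the default of true, this kind of run fails with
    # "cannot remove existing formatting ... in safe mode".
    storage_safe_mode: false
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        fs_type: xfs
        mount_point: /opt/test1
        # Removing the LUKS layer from a previously encrypted disk.
        encryption: false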